This report is a follow-up to the 2025 session report, identifying online-safety legislation emerging in the first two weeks of the 2026 U.S. state legislative sessions. As of January 16, 2026, many state legislatures are just convening and the bill landscape remains incomplete, but early filings already signal a renewed push for age-verification mandates, content-scanning requirements, and platform accountability measures. The goal here is to flag these well-intentioned proposals as potential vectors for privacy breaches so that the risks can be weighed against the benefits.
Data breaches are routine—thousands happen every year. What makes compliance-mandated breaches different is that users have no choice. When legislation requires platforms to collect government IDs, selfies, or biometric data, users must hand over sensitive documents just to access services. The breach surface isn't a bug; it's a direct consequence of the policy.
The Electronic Frontier Foundation made this explicit in their December 2025 "Breachies" awards, giving Discord "The We Still Told You So Award":
"Last year, AU10TIX won our first The We Told You So Award because as we predicted in 2023, age verification mandates would inevitably lead to more data breaches, potentially exposing government IDs as well as information about the sites that a user visits. Like clockwork, they did."
The UK's Online Safety Act duties entered force mid-2025, requiring "highly effective" age checks and children's risk assessments (Ofcom quick guide; Ofcom statement; Ofcom age-checks in force; UK government explainer). Within months, the consequences materialized.
In late 2025, Discord disclosed that a third-party vendor (identified as 5CA) was compromised—not Discord's own systems—exposing users' government-ID photos collected for age-appeal reviews. According to EFF's analysis, the breach included "real names, selfies, ID documents, email and physical addresses, phone numbers, IP addresses, and other contact details or messages provided to customer support. In some cases, 'limited billing information' was also accessed."
Discord disputed threat-actor claims of "millions" of records and refused ransom (BleepingComputer; Tom's Hardware; The Verge). The support stack involved Zendesk, through which ticket data and attachments were allegedly exfiltrated (SOCRadar). After Discord named 5CA, the vendor pushed back, suggesting human error rather than a direct breach—underscoring the fragility of outsourced compliance chains (SecurityWeek).
As EFF notes: "Technically though, it wasn't Discord itself that was hacked but their third-party customer support provider... Either way, it's Discord users who felt the impact."
The Discord breach isn't an isolated incident. The 2025 Breachies catalog a pattern in which compliance and data-aggregation requirements create new breach vectors.
When statutes require platforms to conduct risk assessments, operate detection/scanning technologies, or verify age, they create new stores of sensitive data and new points of failure, a breach surface that exists only because compliance demands it.
As legislatures convene throughout January and February 2026, I'm tracking U.S. federal and state proposals that mirror EU/UK-style child-safety frameworks: bills that impose risk assessments and mitigation plans, compelled detection or scanning, age verification, app-store gatekeeping, reporting and removal duties, and data-preservation requirements.
For each bill, I map features to concrete operational requirements (ID collection, biometric inference, client-side analysis) and evaluate privacy, encryption, and civil liberties impacts.
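To make that mapping concrete, here is a minimal sketch of the record structure behind each bill entry below. The field names are my own shorthand; they simply mirror the Why Relevant / Mechanism of Influence / Evidence / Ambiguity Notes sections used throughout this report.

```python
from dataclasses import dataclass, field

@dataclass
class Finding:
    """One feature-to-requirement mapping for a single bill.

    Field names are illustrative shorthand for the per-bill sections
    in this report; nothing here is prescribed by any bill.
    """
    why_relevant: str       # which Chat-Act-style feature the bill replicates
    mechanism: str          # the concrete operational requirement it imposes
    evidence: list[str] = field(default_factory=list)  # short quotes with section cites
    ambiguity_notes: str = ""  # undefined terms and open interpretations

@dataclass
class BillAnalysis:
    legislation_id: str
    bill_number: str
    summary: str
    findings: list[Finding] = field(default_factory=list)
```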
Not all states have convened yet, and most are still in the early stages of introducing bills.
EFF: The Breachies 2025 · Discord press statement · BleepingComputer · The Verge · Tom's Hardware · SOCRadar · The Guardian · Ofcom: age checks · SecurityWeek
Unlike previous blog posts, this report uses our similarity search engine instead of our concept search engine. To perform the search, I used the text of the second chapter of the EU's Chat Control Act as the query and applied a filtering prompt to the results.
Role: You are a legislative analyst. Your job is to evaluate a single U.S. bill (federal or state) to determine whether it is a "spiritual successor" to the EU Chat Act—that is, whether it meaningfully replicates its risk-assessment, mitigation, compelled detection/scanning, age-verification, app-store gatekeeping, reporting, removal/blocking, data-preservation, and compliance-infrastructure features aimed at preventing child sexual abuse online, with material implications for privacy, encryption, and user rights.
Guidelines: Work strictly from the provided bill text. Be conservative; prefer explicit statutory language over inferences. Quote only short, necessary phrases with section cites.
Mark YES/LIKELY when the bill includes one or more of the core elements (strong signals), especially in combination: compelled risk assessments or mitigation plans; mandated detection, scanning, or filtering; age verification or age assessment; app-store gatekeeping; reporting, removal, or blocking obligations; data-preservation duties; or dedicated compliance infrastructure.
Mark NO/UNLIKELY when the bill is only about offender-focused criminal penalties, sentencing, or other matters that impose no proactive obligations on platforms.
"risk assessment," "risk mitigation," "implementation plan," "coordinating authority," "trusted flagger," "detection order," "install technologies," "indicators," "hash," "solicitation of children," "age verification/assessment," "reliably identify child users," "app store," "interpersonal communication service," "private communications," "reporting obligations," "template," "central clearinghouse/center," "removal order," "blocking order," "URL list," "data preservation," "user redress," "point of contact," "legal representative," "end-to-end encryption," "client-side scanning," "least intrusive," "error rate," "state of the art."
I limited the search to the first 120 results per state and dropped all results that the filtering prompt labeled "false." Because this search already casts a wide net, marginal results are included in this report.
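For readers who want the shape of the pipeline, here is a minimal sketch of that filtering pass. Both `similarity_search` and `classify_bill` are hypothetical stand-ins: the first for our similarity search engine, the second for an LLM call that applies the Role/Guidelines prompt above and treats YES/LIKELY as true.

```python
RESULTS_PER_STATE = 120  # cap applied to each state's result list

def similarity_search(query: str, state: str, limit: int) -> list[dict]:
    """Hypothetical stand-in for the similarity search engine."""
    raise NotImplementedError

def classify_bill(bill_text: str) -> bool:
    """Hypothetical stand-in for the LLM filtering prompt.

    Returns True for a YES/LIKELY verdict and False for NO/UNLIKELY.
    """
    raise NotImplementedError

def find_successor_bills(chat_act_chapter: str, states: list[str]) -> dict[str, list[dict]]:
    kept: dict[str, list[dict]] = {}
    for state in states:
        # Rank the state's bills by similarity to Chapter II of the
        # EU Chat Control Act, keeping only the first 120 results.
        results = similarity_search(chat_act_chapter, state=state, limit=RESULTS_PER_STATE)
        # Drop everything the filtering prompt labels "false."
        kept[state] = [r for r in results if classify_bill(r["text"])]
    return kept
```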
Legislation ID: 247808
Bill URL: View Bill
HB 2133 amends Arizona's laws concerning the unlawful disclosure of images depicting nudity or sexual activities. It establishes definitions and penalties for violations, focusing in particular on consent verification for the publication of sexual material on the internet. The bill also transfers and renumbers existing statutes related to sexual material, and it provides a framework for civil penalties against commercial entities that fail to comply with consent requirements.
| Date | Action |
|---|---|
| 2026-01-13 | House 2nd Read |
| 2026-01-12 | Filed |
| 2026-01-12 | House 1st Read |
Why Relevant: The bill mandates the use of automated scanning technologies to prevent the distribution of specific content.
Mechanism of Influence: Section 44-7302(A)(3) requires entities to 'Implement reasonable measures to prevent the uploading of sexual material that does not have verified consent, including using automated detection tools where feasible.'
Evidence:
Ambiguity Notes: The phrase 'where feasible' is not defined, leaving open whether this compels the adoption of specific hashing or AI-based scanning technologies similar to those proposed in the EU Chat Act.
Why Relevant: The bill imposes a mandatory age-verification and consent-verification framework for commercial platforms.
Mechanism of Influence: Section 44-7302(A)(1) requires entities to 'verify, using reasonable consent verification methods' that depicted individuals are 'at least eighteen years of age.'
Evidence:
Ambiguity Notes: While it lists 'independent third party' verification, it does not specify the technical standards for 'reasonable' methods, potentially leading to invasive identity checks.
Why Relevant: The bill establishes long-term data retention requirements for compliance and enforcement.
Mechanism of Influence: Section 44-7302(A)(2) requires entities to 'Maintain records of the verification for at least seven years' and make them available for inspection by the attorney general.
Evidence:
Ambiguity Notes: The requirement to maintain records for seven years creates a significant data-preservation obligation for platforms regarding user identity and consent.
Legislation ID: 247989
Bill URL: View Bill
HB 2192 establishes regulations for content creators who feature minors in video content, outlining definitions, compensation requirements, record-keeping obligations, and procedures for minors to request the removal of their identifiable information from such content. The bill seeks to safeguard minors' rights and ensure they receive appropriate compensation for their participation in content creation.
| Date | Action |
|---|---|
| 2026-01-14 | House 2nd Read |
| 2026-01-13 | House 1st Read |
| 2026-01-12 | Filed |
Why Relevant: The bill mandates that platforms perform ongoing risk assessments and mitigation, a core pillar of the EU Chat Act.
Mechanism of Influence: Platforms are legally required to document and reassess strategies to mitigate risks of 'sexualization of a minor.'
Evidence:
Ambiguity Notes: The term 'risk-based strategy' is broad and grants platforms discretion but creates a binding obligation to have a mitigation framework in place.
Why Relevant: The bill explicitly suggests the use of automated scanning/detection technologies as a mitigation tool.
Mechanism of Influence: While framed as discretionary, the inclusion of 'automated systems' in a list of approved risk-mitigation strategies encourages the use of scanning to identify 'problematic content.'
Evidence:
Ambiguity Notes: The bill does not strictly 'compel' scanning via a court order like the EU's 'detection orders,' but lists it as a primary method for fulfilling the mandatory risk-mitigation duty.
Why Relevant: The bill creates a mandatory removal and takedown infrastructure for platforms.
Mechanism of Influence: Platforms must provide a submission mechanism and are legally compelled to 'take all reasonable steps to remove the content' if the original creator fails to act.
Evidence:
Ambiguity Notes: The removal right is specifically for individuals who were minors at the time of filming, but it creates a permanent compliance infrastructure for content deletion.
Why Relevant: The bill requires internal controls and transparency reporting similar to the EU's compliance infrastructure.
Mechanism of Influence: Platforms must publish their policies and best practices, and implement quality assurance to ensure mitigations are effective.
Evidence:
Ambiguity Notes: None
Legislation ID: 248149
Bill URL: View Bill
SB1077 amends existing Arizona statutes to introduce stricter penalties for dangerous crimes against children, particularly those involving sexual exploitation and trafficking. It establishes new offenses related to the use of interactive computer services for prostitution and enhances penalties for those involved in child sex trafficking, especially if the victims are minors. The bill aims to deter such crimes and protect vulnerable populations from exploitation.
| Date | Action |
|---|---|
| 2026-01-14 | Senate 2nd Read |
| 2026-01-12 | Filed |
| 2026-01-12 | Senate 1st Read |
Why Relevant: The bill mandates age verification for certain types of content.
Mechanism of Influence: Section 13-3213(C) creates a Class 4 felony for a person or agent of an enterprise who 'knowingly exposes sexual material that is harmful to minors without using reasonable age verification methods.'
Evidence:
Ambiguity Notes: The bill references 'section 18-701' for the definition of reasonable age verification methods, which is not provided in this text, leaving the specific technical requirements undefined here.
Why Relevant: The bill defines and targets 'interactive computer services' similar to the scope of the EU Chat Act.
Mechanism of Influence: The definition includes any 'information service, system or access software provider that provides or enables computer access by multiple users to a computer server,' which captures social media and messaging platforms.
Evidence:
Ambiguity Notes: While the scope is broad, the duty is negative (refraining from facilitating prostitution) rather than positive (mandated proactive scanning or risk assessments).
Legislation ID: 264308
Bill URL: View Bill
This bill introduces new requirements and civil remedies for the protection of minors who are involved in the creation of online content. Effective June 1, 2027, it outlines criteria for content creators, mandates record-keeping regarding minors, and establishes trust accounts for their earnings. Additionally, it allows minors to request the removal of their identifiable information from online content, and it prohibits the exploitation of minors in a manner that could lead to sexual gratification.
| Date | Action |
|---|---|
| 2026-01-14 | |
Why Relevant: The bill mandates provider risk assessments and mitigation plans similar to the EU CSA Act.
Mechanism of Influence: Section 8-12.5-104(4)(a) requires platforms to 'develop and implement a risk-based strategy to help mitigate risks related to monetization of the intentional sexualization of known minors.'
Evidence:
Ambiguity Notes: The term 'risk-based strategy' is broad and grants platforms 'sole discretion' in implementation, but specifically suggests automated systems and recommendation guardrails.
Why Relevant: The bill encourages or compels the use of automated detection technologies.
Mechanism of Influence: The risk-based strategy may include 'automated systems to identify and enforce against potentially problematic online content' (8-12.5-104(4)(b)(III)).
Evidence:
Ambiguity Notes: While listed as a discretionary component of the strategy, the mandate to 'mitigate risks' often necessitates the adoption of these automated tools in practice.
Why Relevant: The bill establishes a mandatory removal and blocking regime for specific content.
Mechanism of Influence: If a content creator fails to comply with a privacy removal request, the platform is legally required to 'review and take all reasonable steps to remove the online content' (8-12.5-103(3)(b)).
Evidence:
Ambiguity Notes: The 'reasonable steps' standard for platforms creates a statutory duty to moderate content upon notice, mirroring the 'removal order' features of the EU Act.
Why Relevant: The bill requires compliance infrastructure and public transparency regarding moderation policies.
Mechanism of Influence: Platforms must provide an 'easily accessible mechanism' for removal requests and publish information about policies and 'best practices' for content featuring minors.
Evidence:
Ambiguity Notes: None
This bill introduces the Artificial Intelligence Bill of Rights in Florida, defining artificial intelligence and prohibiting certain contracts with foreign entities of concern. It lays out various rights for Floridians related to AI, including rights to privacy, consent, and protection from misuse of AI technologies. The bill also mandates that AI technologies must not infringe on personal rights and establishes penalties for violations.
| Date | Action |
|---|---|
| 2026-01-15 | H Now in Information Technology Budget & Policy Subcommittee |
| 2026-01-15 | H Referred to Civil Justice & Claims Subcommittee |
| 2026-01-15 | H Referred to Commerce Committee |
| 2026-01-15 | H Referred to Information Technology Budget & Policy Subcommittee |
| 2026-01-15 | H Referred to State Affairs Committee |
| 2026-01-13 | H 1st Reading |
| 2026-01-09 | H Filed |
Why Relevant: The bill mandates mitigation measures to prevent the dissemination of harmful content to minors.
Mechanism of Influence: Platforms must implement technical or procedural safeguards to ensure AI chatbots do not generate or share prohibited materials.
Evidence:
Ambiguity Notes: The term 'reasonable measures' is not defined, potentially allowing the state to mandate specific filtering or scanning technologies.
Why Relevant: The bill requires age-gating and parental consent for minors to access messaging-like AI services.
Mechanism of Influence: Companion chatbot platforms must verify age or obtain parental consent before allowing minors to create or maintain accounts.
Evidence:
Ambiguity Notes: The bill does not specify the method of verification, but mandates the prohibition of unconsented minor accounts.
Why Relevant: The bill provides for data access and preservation of user interactions.
Mechanism of Influence: Parents are granted a statutory right to receive copies of all interactions, and platforms must delete data upon account termination unless law requires preservation.
Evidence:
Ambiguity Notes: The interaction between the deletion mandate and 'state or federal law' preservation requirements may create a duty to retain evidence of potential harms.
Why Relevant: The bill establishes significant compliance infrastructure and state oversight.
Mechanism of Influence: It subjects out-of-state platforms to Florida jurisdiction and empowers the Department of Legal Affairs to subpoena evidence and enforce penalties.
Evidence:
Ambiguity Notes: The broad investigative powers could be used to compel disclosure of internal moderation algorithms or training data.
This bill creates a new section in the Florida Statutes addressing companion chatbots, defining their characteristics and the responsibilities of their operators. It mandates that operators provide clear notifications about the nature of the chatbots, implement protocols to prevent harmful content, and verify user ages, especially for minors. Additionally, operators must submit annual reports to the Department of Legal Affairs regarding their compliance with these provisions.
| Date | Action |
|---|---|
| 2026-01-13 | H 1st Reading |
| 2025-12-16 | H Now in Industries & Professional Activities Subcommittee |
| 2025-12-16 | H Referred to Commerce Committee |
| 2025-12-16 | H Referred to Industries & Professional Activities Subcommittee |
| 2025-12-16 | H Referred to Information Technology Budget & Policy Subcommittee |
| 2025-12-04 | H Filed |
Why Relevant: The bill mandates the implementation of technology to monitor and identify specific user content within the chatbot interface.
Mechanism of Influence: Operators must implement scanning mechanisms to 'detect' user expressions of suicidal ideation or self-harm in real-time, effectively requiring surveillance of user-to-AI interactions.
Evidence:
Ambiguity Notes: The term 'detect' implies automated surveillance of user input, though the specific technical standards or error-rate limits for this detection are not defined.
Why Relevant: The bill requires mandatory age-gating for all users accessing the platform.
Mechanism of Influence: Operators are legally required to integrate 'anonymous' or 'standard' age verification systems to control access to the service, mirroring the EU's age-verification mandates.
Evidence:
Ambiguity Notes: The bill refers to s. 501.1737 for standards, which typically involves third-party identity verification and data collection.
Why Relevant: The bill establishes a centralized reporting and oversight mechanism similar to the EU's coordinating authority requirements.
Mechanism of Influence: Operators must submit annual reports to the Department of Legal Affairs detailing their safety protocols and the frequency of their interventions (e.g., suicide lifeline referrals).
Evidence:
Ambiguity Notes: None
Why Relevant: The bill mandates risk mitigation protocols and content filtering as a condition of service operation.
Mechanism of Influence: Operators are prohibited from providing the service unless they maintain specific protocols to prevent harmful content, including sexually explicit material for minors, necessitating model-level or output-level filtering.
Evidence:
Ambiguity Notes: 'Reasonable measures' to prevent sexually explicit conduct is a broad standard that may compel intrusive filtering or client-side analysis.
Legislation ID: 249908
Bill URL: View Bill
This bill establishes regulations for companion AI chatbots, requiring operators to implement age verification for users and to protect minors from accessing inappropriate content. It mandates that users create accounts, undergo age verification, and receive notifications about their interactions with AI chatbots. Additionally, the bill outlines penalties for non-compliance and empowers the Department of Legal Affairs to enforce these regulations.
| Date | Action |
|---|---|
| 2026-01-07 | Filed |
Why Relevant: The bill mandates age verification for all users of interpersonal AI services.
Mechanism of Influence: Operators must 'Verify the user’s age using standard age verification or anonymous age verification' (s. 501.1739(4)(b)) and 'classify each user as either a minor or an adult' (s. 501.1739(3)(c)).
Evidence:
Ambiguity Notes: The term 'standard age verification' is broadly defined as 'any commercially reasonable method... approved by the operator' (s. 501.1739(1)(h)), leaving the specific technology choice to the provider but requiring it to be 'commercially reasonable.'
Why Relevant: The bill requires operators to block specific categories of content for minor users, similar to exposure-limiting mandates.
Mechanism of Influence: Operators are legally compelled to 'Block the minor’s access to any companion AI chatbot that prompts, promotes, solicits, or otherwise suggests sexually explicit communication' (s. 501.1739(5)(c)).
Evidence:
Ambiguity Notes: The phrase 'otherwise suggests' is broad and could require proactive monitoring or filtering of AI outputs to ensure no sexually explicit content is generated for a minor.
Why Relevant: The bill establishes a robust compliance and investigative infrastructure.
Mechanism of Influence: The Department of Legal Affairs is granted authority to 'subpoena witnesses or matter, and collect evidence' (s. 501.1739(11)(a)) and can levy civil penalties of 'up to $50,000 per violation' (s. 501.1739(8)(a)).
Evidence:
Ambiguity Notes: The bill asserts jurisdiction over any operator making a chatbot available in the state, regardless of their physical location (s. 501.1739(10)).
The App Store Accountability Act introduces regulations for app store providers and developers to ensure the safety and privacy of minors. It mandates age verification processes, parental consent for accounts held by minors, and specific obligations regarding the handling of age-related data. The act also allows for civil actions against violators, ensuring that minors and their parents can seek redress for any breaches of the law.
| Date | Action |
|---|---|
| 2026-01-09 | Filed |
Why Relevant: The bill mandates age verification for all users of app stores, a core pillar of the EU's child safety framework.
Mechanism of Influence: App store providers are legally required to 'verify the individual’s age category' using 'commercially available methods' or state-defined rules, effectively creating a mandatory age-gate for mobile software access.
Evidence:
Ambiguity Notes: The term 'commercially available methods' is broad and could encompass privacy-invasive technologies like biometric face-scanning or government ID uploads, similar to debates surrounding the EU Chat Act's age-verification requirements.
Why Relevant: The bill imposes significant gatekeeping obligations on app stores to control minor access to third-party services.
Mechanism of Influence: App stores must act as the central enforcement point, requiring 'verifiable parental consent' for every app download or in-app purchase by a minor, and must notify parents of 'significant changes' to an app's privacy policy or content rating.
Evidence:
Ambiguity Notes: The requirement to obtain consent 'each time' before a download (s. 501.1733(2)(a)2) creates a high-friction environment that may compel developers to adopt more aggressive tracking to maintain compliance.
Why Relevant: The bill creates a compliance infrastructure for data sharing between platforms and developers regarding user age and consent status.
Mechanism of Influence: Developers are compelled to 'Verify through the app store’s data-sharing methods' the age of their users and use that data to 'enforce any developer-created, age-related restrictions [or] safety-related features.'
Evidence:
Ambiguity Notes: While the bill requires encryption for this data, the mandatory sharing of 'age category data' between app stores and third-party developers creates new data-leakage risks and centralized databases of minor status.
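To see why that sharing creates leakage risk, consider one plausible shape for the mandated channel: a signed age-category attestation handed from the store to each developer. The statute requires encryption but prescribes no protocol, so everything below, including the shared key and field names, is my own assumption.

```python
import hashlib
import hmac
import json

# Hypothetical shared secret provisioned by the app store to one developer.
STORE_KEY = b"example-shared-secret"

def issue_age_attestation(user_id: str, age_category: str) -> dict:
    """App-store side: sign a user's age category for a developer."""
    payload = json.dumps({"user": user_id, "age_category": age_category})
    sig = hmac.new(STORE_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": sig}

def read_age_attestation(attestation: dict) -> str | None:
    """Developer side: verify the signature, then read the category."""
    expected = hmac.new(STORE_KEY, attestation["payload"].encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, attestation["sig"]):
        return None  # tampered or mis-keyed attestation
    return json.loads(attestation["payload"])["age_category"]
```

Even in this minimal form, every developer integration becomes another holder of per-user minor-status records, which is the data-leakage concern noted above.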
The bill amends existing statutes and creates new sections to define artificial intelligence, prohibit certain contracts with foreign entities, and establish the Artificial Intelligence Bill of Rights for Floridians. It outlines the rights of individuals regarding AI, including consent requirements for minors, protections against misuse of personal data, and civil remedies for violations. The bill also imposes obligations on AI technology companies and chatbot platforms to protect user information and restrict access for minors without parental consent.
| Date | Action |
|---|---|
| 2026-01-13 | Introduced |
| 2026-01-07 | Referred to Commerce and Tourism; Appropriations |
| 2025-12-22 | Filed |
Why Relevant: The bill mandates age verification and parental consent for access to specific online services.
Mechanism of Influence: By requiring platforms to 'prohibit a minor from... becoming an account holder... unless the minor’s parent or guardian provides consent,' the bill effectively compels the implementation of age-verification or age-assessment technologies.
Evidence:
Ambiguity Notes: The bill does not specify the technical standard for 'consent' or age verification, leaving the 'reliability' of these checks to platform discretion or future rulemaking.
Why Relevant: The bill imposes a duty on providers to mitigate the risk of children being exposed to harmful content.
Mechanism of Influence: Platforms must 'institute reasonable measures' to prevent chatbots from 'producing or sharing materials harmful to minors,' which functions as a mandated content-mitigation and safety-by-design requirement.
Evidence:
Ambiguity Notes: 'Reasonable measures' is not defined, potentially requiring platforms to implement proactive filtering or monitoring of AI outputs to ensure compliance.
Why Relevant: The bill compels the creation of monitoring and reporting infrastructure for private interactions.
Mechanism of Influence: Platforms are required to provide parents with 'copies of all past or present interactions' and 'timely notifications' regarding self-harm or intent to harm others, necessitating internal data-retention and monitoring systems.
Evidence:
Ambiguity Notes: While focused on AI-to-human interaction, the requirement to provide 'all past or present interactions' creates a statutory mandate for platforms to maintain accessible, unencrypted logs of user communications.
Legislation ID: 188104
Bill URL: View Bill
This bill amends existing laws in Georgia regarding obscenity and related offenses by specifically prohibiting the distribution of computer-generated obscene material that depicts children. It establishes definitions for obscenity and child, outlines penalties for violations, and mandates reporting for individuals who suspect they are processing such material. Additionally, it introduces enhanced sentencing for defendants who utilize artificial intelligence in the commission of designated offenses.
| Date | Action |
|---|---|
| 2026-01-12 | Senate Recommitted |
| 2025-03-31 | Senate Committee Favorably Reported By Substitute |
| 2025-03-28 | Senate Recommitted |
| 2025-03-27 | Senate Committee Favorably Reported By Substitute |
| 2025-03-27 | Senate Read Second Time |
| 2025-02-27 | Senate Read and Referred |
| 2025-02-26 | House Passed/Adopted By Substitute |
| 2025-02-26 | House Third Readers |
Why Relevant: The bill establishes a mandatory reporting requirement to a centralized body (NCMEC) for suspected child sexual abuse material.
Mechanism of Influence: It compels any person 'processing or producing visual or printed matter' to immediately report suspected CSAM to the National Center for Missing and Exploited Children and state authorities.
Evidence:
Ambiguity Notes: The phrase 'processing or producing' is not explicitly defined to include or exclude online service providers, potentially capturing AI generation platforms or cloud storage services.
Legislation ID: 242366
Bill URL: View Bill
House Bill No. 1085 introduces provisions for civil liability concerning child sexual abuse material and obscene material on the Internet. It enables individuals depicted in or exposed to such materials to file civil actions against those who knowingly allow access to, disseminate, or provide the content. The bill also allows the attorney general to seek injunctive relief and establishes a safe harbor provision for certain entities under specific conditions. Notably, it states that comparative fault and tort claims immunities do not apply to these civil actions.
| Date | Action |
|---|---|
| 2026-01-13 | Representative Goss-Reaves added as coauthor |
| 2026-01-05 | Authored by Representative King |
| 2026-01-05 | First reading: referred to Committee on Judiciary |
Why Relevant: The bill includes a safe harbor mechanism contingent on the removal or blocking of prohibited content.
Mechanism of Influence: Under Sec. 1(b)(2), a provider avoids liability only if it takes 'immediate voluntary good faith actions to remove or block access to the prohibited material' after gaining actual knowledge.
Evidence:
Ambiguity Notes: While this incentivizes removal, it is a liability shield rather than a proactive administrative 'removal order' as seen in the EU CSA Act.
House Bill No. 1178 establishes requirements for social media providers regarding the creation and management of accounts for minors. It mandates that social media providers must obtain verifiable parental consent before allowing individuals under 14 years old to create accounts. Additionally, it requires providers to offer parental controls, regularly verify the age of account holders, and establish processes for account termination. Violations of these provisions may result in civil action and enforcement by the attorney general.
| Date | Action |
|---|---|
| 2026-01-05 | Authored by Representative King |
| 2026-01-05 | Coauthored by Representatives Behning, Teshka |
| 2026-01-05 | First reading: referred to Committee on Judiciary |
Why Relevant: The bill mandates robust, ongoing age verification and assessment for all users, a core pillar of the EU's child protection framework.
Mechanism of Influence: Providers must use 'commercially reasonable means' to determine age and must perform periodic re-verification at 80% and 90% confidence levels once users reach specific usage milestones (25, 50, and 100 hours).
Evidence:
Ambiguity Notes: The bill does not define 'commercially reasonable means' or 'confidence' metrics, potentially forcing providers to adopt invasive biometric or identity-linked verification to meet the 90% threshold.
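Read literally, the re-verification duty implies a schedule like the sketch below. The pairing of confidence levels to milestones is my own assumption, since the bill text does not say which threshold attaches to which milestone.

```python
# Assumed pairing of confidence thresholds to usage milestones; the
# bill specifies 80%/90% confidence levels and 25/50/100-hour
# milestones without stating which goes with which.
MILESTONES = [
    (25, 0.80),
    (50, 0.80),
    (100, 0.90),
]

def needs_reverification(hours_used: float, last_confidence: float) -> bool:
    """Return True if the provider must re-verify the user's age."""
    return any(
        hours_used >= threshold and last_confidence < required
        for threshold, required in MILESTONES
    )
```

The practical effect is a ratchet: the longer a user stays on the platform, the stronger, and likely more identity-linked, the verification evidence must become.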
Why Relevant: The bill mandates specific risk mitigation changes to product features and recommender systems.
Mechanism of Influence: It statutorily compels providers to disable features deemed harmful to minors, including 'profile based content feeds' and 'continuous loading,' effectively regulating the platform's core architecture and algorithms.
Evidence:
Ambiguity Notes: While the EU Act requires providers to propose mitigations based on risk assessments, this bill bypasses the assessment and mandates specific feature removals.
Why Relevant: The bill includes data preservation and internal control requirements regarding parental consent.
Mechanism of Influence: Providers are required to retain documentation establishing they received verifiable parental consent for active accounts, creating a compliance infrastructure for age-gating.
Evidence:
Ambiguity Notes: The bill simultaneously requires the immediate deletion of personal information used for consent registration, creating a potential conflict with the duty to retain 'sufficient' documentation for enforcement.
Legislation ID: 242574
Bill URL: View Bill
Senate Bill No. 129 mandates that social media operators obtain verifiable parental consent before allowing minor users, defined as individuals under 16 years of age, to view content on their platforms. The bill establishes the framework for enforcement by the attorney general, including the ability to issue civil investigative demands and take legal action against non-compliant operators. Additionally, it highlights the risks associated with social media use among minors, such as increased rates of depression and suicide, while balancing parental rights and children's safety.
| Date | Action |
|---|---|
| 2025-12-11 | Authored by Senators Bohacek, Brown L |
| 2025-12-11 | First reading: referred to Committee on Judiciary |
Why Relevant: The bill mandates age verification and parental consent for social media access, a core element of the spiritual successor criteria.
Mechanism of Influence: Operators must implement 'commercially reasonable methods' to identify minor users and obtain consent, effectively creating a gatekeeping mechanism for platform access.
Evidence:
Ambiguity Notes: The term 'commercially reasonable methods' is not defined, potentially allowing for highly intrusive verification technologies such as biometric scanning or government ID uploads.
Why Relevant: The bill imposes data security and encryption requirements on information collected for age verification.
Mechanism of Influence: Requires social media operators to encrypt all data collected and retained under the chapter, impacting how user verification data is handled and stored.
Evidence:
Ambiguity Notes: While it mandates encryption for storage, it does not address the privacy implications of the initial collection of sensitive data required for 'verifiable' consent.
Why Relevant: The bill explicitly excludes app stores, which is a significant deviation from the EU Chat Act's model.
Mechanism of Influence: By excluding device manufacturers and app stores, the bill focuses the compliance burden solely on the social media operators themselves, rather than the distribution layer.
Evidence:
Ambiguity Notes: None
This bill establishes guidelines for social media services to protect adolescents and children from potential harm. It mandates age verification for account creation and restricts access for users identified as minors unless parental consent is provided. Additionally, it outlines the responsibilities of social media services in configuring accounts for minors and prohibits the collection of personal information from users known to be minors.
| Date | Action |
|---|---|
| 2026-01-08 | Senator Rogers added as second author |
| 2026-01-05 | Authored by Senator Raatz |
| 2026-01-05 | First reading: referred to Committee on Education and Career Development |
Why Relevant: The bill mandates age verification and age assessment for social media users.
Mechanism of Influence: It requires providers to use 'reasonable age verification methods' to identify child and adolescent users before allowing account creation.
Evidence:
Ambiguity Notes: The term 'reasonable age verification method' includes third-party services and transactional data, which may have privacy implications similar to those discussed in the EU context.
Why Relevant: The bill imposes mandated mitigation changes to product features and moderation for minor accounts.
Mechanism of Influence: It compels services to configure accounts for minors to limit exposure, specifically restricting direct communications and algorithmic content selection.
Evidence:
Ambiguity Notes: While not a 'risk assessment' in the formal sense, these are statutory mitigation mandates aimed at preventing harm to minors.
Legislation ID: 63826
Bill URL: View Bill
House File 62 establishes civil liability for commercial entities that knowingly publish or distribute obscene material on the internet without verifying the age of the user. It mandates the use of age verification methods to prevent minors from accessing such material, while also clarifying that providers of interactive computer services are not liable under this law.
| Date | Action |
|---|---|
| 2026-01-13 | Subcommittee: Alons, Bennett, and Taylor. |
| 2025-06-16 | Referred to Technology. S.J. 1057. |
| 2025-04-03 | Placed on calendar under unfinished business. S.J. 689. |
| 2025-03-25 | Explanation of vote. H.J. 811. |
| 2025-03-25 | Explanations of votes. H.J. 812. |
| 2025-03-24 | Message from House. S.J. 607. |
| 2025-03-24 | Read first time, attached to SF 443. S.J. 607. |
| 2025-03-20 | Immediate message. H.J. 767. |
Why Relevant: The bill mandates age verification for access to specific online content, a core element of the EU Chat Act's child protection framework.
Mechanism of Influence: Requires 'covered platforms' to verify user age via government ID or financial documents before allowing access to 'obscene material' (Sec 2.1).
Evidence:
Ambiguity Notes: The catch-all 'any other commercially reasonable and reliable method' (Sec 1.7.c) leaves the specific technologies for verification undefined.
Why Relevant: The bill addresses data preservation and privacy by restricting the retention of identifying information used for verification.
Mechanism of Influence: Mandates that platforms or third-party verifiers must not retain or disseminate identifying information after the verification process is complete (Sec 2.3).
Evidence:
Ambiguity Notes: While it protects privacy by prohibiting retention, it does not address the security of the verification process itself beyond a mention of 'cryptographic techniques' (Sec 2.2).
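One way to square mandatory verification with the Sec 2.3 retention ban is an ephemeral check that discards identifying data the moment a pass/fail result is issued. The sketch below is illustrative only; the bill prescribes no particular design, and `check_government_id` is a hypothetical stand-in for a third-party verifier.

```python
import secrets

def check_government_id(id_document: bytes) -> bool:
    """Hypothetical stand-in for a third-party ID verification service."""
    raise NotImplementedError

def verify_age_ephemeral(id_document: bytes) -> str | None:
    """Illustrative verify-then-discard flow consistent with Sec 2.3.

    The ID document is checked once and never written to storage; the
    caller receives only an opaque token proving a check passed.
    """
    is_adult = check_government_id(id_document)
    del id_document  # drop the local reference; nothing is persisted
    if not is_adult:
        return None
    return secrets.token_urlsafe(32)  # carries no identity, only "passed"
```

The Discord incident shows why this matters: whatever a platform or its vendors keep after verification is what a breach eventually exposes.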
Why Relevant: The bill establishes a centralized reporting mechanism for violations, similar to the compliance infrastructure in the EU Chat Act.
Mechanism of Influence: Requires the attorney general to create an electronic system for individuals to report platforms that fail to implement age checks (Sec 3.4).
Evidence:
Ambiguity Notes: The reporting system is for statutory violations by platforms, not for reporting specific instances of illegal content (CSAM) to a central hub.
Legislation ID: 251636
Bill URL: View Bill
This bill outlines the requirements for parental consent when children create accounts on covered AI companion or social media platforms. It includes provisions for resolving disputes regarding a child's age, invalidation of contracts made without proper consent, and the establishment of penalties for violations of these provisions. The bill also empowers the Attorney General to enforce these regulations and provides mechanisms for civil action against non-compliant platforms.
| Date | Action |
|---|---|
| 2026-01-14 | to Small Business & Information Technology (H) |
| 2026-01-07 | introduced in House to Committee on Committees (H) |
Why Relevant: The bill mandates sophisticated age estimation and verification, a core pillar of the EU's approach to online child safety.
Mechanism of Influence: Platforms must use 'reasonable means and efforts' to 'estimate the age of the account holder' with specific confidence thresholds of 80% and 90%.
Evidence:
Ambiguity Notes: The bill does not specify which technologies satisfy 'reasonable means,' potentially pressuring platforms toward biometric or invasive identity scanning.
Why Relevant: The bill requires material changes to product design and features to mitigate perceived risks to minors, similar to the EU's mitigation mandates.
Mechanism of Influence: It bans 'addictive features' and 'profile-based paid commercial advertising' for users identified as children.
Evidence:
Ambiguity Notes: The definition of 'addictive feature' is broad, covering standard UI elements like push notifications and infinite scroll.
Why Relevant: The bill mandates ongoing monitoring and data analytics to maintain age-based compliance.
Mechanism of Influence: Platforms must update age estimates every 100 hours of usage or whenever they apply 'any form of data analytics or artificial intelligence' to update other demographic data.
Evidence:
Ambiguity Notes: This creates a statutory duty for continuous user profiling to ensure age accuracy.
Why Relevant: The bill establishes a compliance and enforcement infrastructure with significant penalties.
Mechanism of Influence: It empowers the Attorney General to investigate and imposes heavy civil penalties for 'reckless or intentional' violations.
Evidence:
Ambiguity Notes: None
Legislation ID: 251651
Bill URL: View Bill
This bill establishes new regulations for social media platforms to ensure the safety of minors. It defines terms related to addictive and nonaddictive feeds, mandates parental consent for minors to access certain content, and requires platforms to implement measures to protect minors from harmful material. Additionally, it outlines enforcement mechanisms and potential penalties for noncompliance.
| Date | Action |
|---|---|
| 2026-01-15 | to Small Business & Information Technology (H) |
| 2026-01-08 | introduced in House to Committee on Committees (H) |
Why Relevant: The bill mandates age verification for all users at account creation.
Mechanism of Influence: Section 3(1) requires platforms to use 'commercially reasonable and technically feasible methods to determine the age of a user with a specified level of accuracy.'
Evidence:
Ambiguity Notes: The 'specified level of accuracy' is not defined, potentially allowing the state to demand high-assurance identity or biometric checks.
Why Relevant: The bill compels the use of scanning/filtering technology to block specific categories of content.
Mechanism of Influence: Section 4(2) explicitly mandates the use of 'filtering technology to block such harmful material' (suicide, substance abuse, harassment) from being displayed to minors.
Evidence:
Ambiguity Notes: While the categories differ from the EU's focus on CSAM, the mandate to implement 'filtering technology' mirrors the EU's 'detection order' mechanism.
Why Relevant: The bill requires platforms to develop and implement risk mitigation plans.
Mechanism of Influence: Section 4(1) requires a 'proactive strategy' to prevent exposure to harmful material, which functions as a mandated mitigation plan.
Evidence:
Ambiguity Notes: The term 'proactive strategy' is broad and could be interpreted by the Attorney General to require specific product design changes or moderation staffing.
Why Relevant: The bill regulates algorithmic 'addictive feeds' and mandates default-safe settings for minors.
Mechanism of Influence: Section 2(1) prohibits providing 'addictive feeds' (algorithmic recommendations) to minors without parental consent, effectively mandating a chronological feed by default.
Evidence:
Ambiguity Notes: None
Legislation ID: 256064
Bill URL: View Bill
This legislation prohibits making artificial intelligence chatbots and social AI companions that exhibit human-like features accessible to minors. It defines what constitutes human-like features and outlines specific requirements for deployers to prevent minors from accessing such technologies. The bill allows exceptions for therapy chatbots under strict conditions, mandates safeguards for user information, and establishes penalties for violations.
| Date | Action |
|---|---|
| 2026-01-13 | Referred in Concurrence |
| 2026-01-13 | Referred to Committee |
Why Relevant: The bill mandates age verification for interpersonal AI communication services.
Mechanism of Influence: Deployers are legally required to implement 'reasonable age verification systems' to ensure minors cannot access prohibited chatbot features, creating a gatekeeping requirement similar to the EU's age-assessment mandates.
Evidence:
Ambiguity Notes: The term 'reasonable' is not defined, leaving open whether this requires high-assurance identity verification or less intrusive methods.
Why Relevant: The bill compels the detection and reporting of specific harmful content/situations.
Mechanism of Influence: It creates a statutory duty to 'detect' and 'report' emergency situations (harm to self or others), which functionally requires the monitoring of user interactions with the AI, mirroring the 'compelled detection' logic of the EU Chat Act.
Evidence:
Ambiguity Notes: While the focus is on 'emergency situations' rather than CSAM, the requirement to 'detect' and 'mitigate' such instances necessitates persistent monitoring of private or semi-private AI interactions.
Why Relevant: The bill imposes data minimization and internal control requirements.
Mechanism of Influence: It restricts data collection to the 'minimum amount necessary' for legitimate purposes, similar to the data protection and internal oversight features of the EU's compliance infrastructure.
Evidence:
Ambiguity Notes: The definition of 'legitimate purpose' is broad and could be interpreted to include the data needed for the mandated detection systems.
Legislation ID: 245812
Bill URL: View Bill
This bill, known as the Anticorruption of Public Morals Act, seeks to regulate the online dissemination of pornographic and prohibited materials. It outlines definitions, establishes penalties for violations, mandates that internet service providers implement filtering technologies, and requires internet platforms to enforce strict content moderation policies. The bill also creates a special enforcement division within the Attorney General's office to ensure compliance and manage a fund for enforcement costs.
| Date | Action |
|---|---|
| 2025-09-16 | bill electronically reproduced 09/11/2025 |
| 2025-09-11 | introduced by Representative Josh Schriver |
| 2025-09-11 | read a first time |
| 2025-09-11 | referred to Committee on Judiciary |
Why Relevant: The bill mandates the use of automated technologies to detect and remove specific categories of content, mirroring the EU's detection order mechanism.
Mechanism of Influence: It requires platforms to implement 'Artificial intelligence driven filtering technology for preemptive removal' (Sec. 4(2)(a)) and 'real-time content scanning, keyword and metadata analysis, [and] image recognition' (Sec. 4(4)(c)).
Evidence:
Ambiguity Notes: The term 'real-time content scanning' is broad and implies a requirement for persistent surveillance of all user-generated content or communications.
Why Relevant: The bill explicitly targets encryption and privacy-preserving technologies that would prevent the mandated scanning or filtering.
Mechanism of Influence: It defines 'circumvention tools' to include 'encrypted tunneling methods' (Sec. 2(a)) and requires ISPs to 'actively monitor and block known circumvention tools' (Sec. 3(3)).
Evidence:
Ambiguity Notes: By labeling 'encrypted tunneling' as a circumvention tool, the bill effectively criminalizes or mandates the blocking of standard security protocols like VPNs and potentially E2EE messaging.
Why Relevant: The bill establishes a centralized compliance and enforcement infrastructure similar to the EU's proposed Coordinating Authorities.
Mechanism of Influence: It creates a 'special internet content enforcement division' (Sec. 5(1)) to 'audit, investigate, and enforce compliance' and manage a 'trusted flagger program' (Sec. 4(5)).
Evidence:
Ambiguity Notes: The 'trusted flagger' program gives law enforcement priority reporting status, which can lead to rapid, extra-judicial content removal.
Why Relevant: The bill imposes strict removal timelines and blocking requirements for service providers.
Mechanism of Influence: It mandates a '2-business-day response time' for flagged content (Sec. 4(2)(c)) and requires ISPs to block websites upon court order (Sec. 3(6)).
Evidence:
Ambiguity Notes: The short 2-day window for removal, combined with high civil fines ($250,000 per day), creates a strong incentive for over-blocking.
Legislation ID: 266144
Bill URL: View Bill
Senate Bill No. 757, known as the "stop addictive feeds exploitation for kids act," is designed to address the issue of addictive internet-based services that may exploit minors. The bill defines key terms, outlines the responsibilities of covered operators, and sets forth rules regarding parental consent and the provision of addictive feeds to users. It also establishes civil penalties for violations and mandates the attorney general to implement rules for enforcement.
| Date | Action |
|---|---|
| 2025-12-17 | INTRODUCED BY SENATOR DARRIN CAMILLERI |
| 2025-12-17 | REFERRED TO COMMITTEE ON FINANCE, INSURANCE, AND CONSUMER PROTECTION |
Why Relevant: The bill mandates age verification or parental consent to gate access to specific service features.
Mechanism of Influence: Operators must have 'actual knowledge' that a user is not a minor or obtain 'verifiable parental consent' before providing an addictive feed, necessitating an age-verification infrastructure.
Evidence:
Ambiguity Notes: The bill defines 'actual knowledge' broadly to include 'inferences known to a covered operator,' which may pressure platforms to perform more invasive profiling to determine age.
Legislation ID: 266146
Bill URL: View Bill
Senate Bill No. 758, known as the kids code act, seeks to regulate online service providers by prohibiting certain acts that could endanger minors. It defines key terms related to data privacy, such as covered online service provider, personal data, and biometric data, and mandates the implementation of privacy settings that prioritize the protection of minors. The bill also outlines the responsibilities of service providers regarding the handling of minors data and the necessity for compliance with existing laws.
| Date | Action |
|---|---|
| 2025-12-17 | INTRODUCED BY SENATOR KEVIN HERTEL |
| 2025-12-17 | REFERRED TO COMMITTEE ON FINANCE, INSURANCE, AND CONSUMER PROTECTION |
Why Relevant: The bill mandates annual independent audits that function as risk assessments and mitigation reports.
Mechanism of Influence: Providers must issue a public report detailing 'design safety features for minors' and 'whether and how the online service uses covered design features' like algorithms or engagement-increasing components.
Evidence:
Ambiguity Notes: While framed as an 'audit' rather than a 'risk assessment,' the requirement to evaluate and report on the impact of design features on minors mirrors the EU's mitigation-plan requirements.
Why Relevant: The bill requires the use and reporting of age verification or estimation technologies.
Mechanism of Influence: Providers must report on the 'Age assurance, age verification, or age estimation methods used' to distinguish between age groups (10-12, 13-15, 16-17, and adults).
Evidence:
Ambiguity Notes: The bill states providers are not 'required' to collect data for age verification (Sec. 11(2)), yet they must have 'actual knowledge' of age to apply the act's protections, effectively necessitating age-gating mechanisms.
Why Relevant: The bill imposes restrictions on private communications between adults and minors.
Mechanism of Influence: It mandates a default setting of 'Not permitting direct messaging' between a minor and an adult unless the minor expressly allows it, which impacts the design of interpersonal messaging services.
Evidence:
Ambiguity Notes: While it does not explicitly mandate scanning of encrypted messages, the classification of message content as 'sensitive personal data' and the restriction on messaging functionality touch on private communication protocols.
Why Relevant: The bill requires a formal compliance infrastructure similar to the EU's legal representative requirements.
Mechanism of Influence: Providers must designate specific officers responsible for ensuring the entity complies with the act's various privacy and safety mandates.
Evidence:
Ambiguity Notes: None
Legislation ID: 266148
Bill URL: View Bill
Senate Bill No. 759 seeks to expand the definitions of unlawful trade practices under the Michigan Consumer Protection Act. It outlines various unfair practices that are considered deceptive or misleading, including misrepresentations about goods and services, failure to disclose material facts, and coercive sales tactics. The bill also grants the Attorney General the authority to promulgate rules to enforce these provisions, ensuring that consumers are better protected from fraudulent activities.
| Date | Action |
|---|---|
| 2025-12-17 | INTRODUCED BY SENATOR STEPHANIE CHANG |
| 2025-12-17 | REFERRED TO COMMITTEE ON FINANCE, INSURANCE, AND CONSUMER PROTECTION |
Why Relevant: The bill explicitly incorporates the 'kids code act' into the state's consumer protection enforcement regime. 'Kids Code' legislation is the standard U.S. model for implementing EU-style safety-by-design, risk assessment, and age-verification mandates.
Mechanism of Influence: By adding 'Violating the kids code act' to the list of unlawful practices, the bill enables the Attorney General to penalize platforms for failing to comply with the underlying safety and risk-mitigation duties defined in the referenced 'Kids Code' legislation.
Evidence:
Ambiguity Notes: The bill does not define the specific duties (such as risk assessments or age checks) within this text, as it is a companion bill that relies on the passage of a separate 'Kids Code Act' (S0383125) to provide the substantive requirements.
Legislation ID: 32085
Bill URL: View Bill
This bill introduces regulations for commercial entities that share or distribute material deemed harmful to minors on their websites. It mandates that these entities verify the age of users accessing such material to ensure they are 18 years or older. The bill outlines definitions, requirements for age verification, data privacy protections, enforcement mechanisms, and the liabilities for violations.
| Date | Action |
|---|---|
| 2025-03-20 | Author added Sexton |
| 2025-03-17 | Author added Schultz |
| 2025-02-24 | Introduction and first reading, referred to Commerce Finance and Policy |
Why Relevant: The bill establishes a mandatory age-verification regime for specific online platforms, which is a core pillar of the EU Chat Act's safety framework.
Mechanism of Influence: Commercial entities must 'verify that an individual... is 18 years of age or older' before allowing access to websites meeting the 25% content threshold, using 'commercially available' databases or state-approved methods.
Evidence:
Ambiguity Notes: The definition of 'material harmful to minors' relies on 'contemporary community standards' and 'prurient interest,' which are subjective legal standards that could lead to broad application across various media types.
Why Relevant: The bill establishes a state-level compliance infrastructure for approving verification technologies.
Mechanism of Influence: The Commissioner of Commerce is granted authority to 'review and approve reliable methods' for age verification, creating a centralized gatekeeping role for compliance standards.
Evidence:
Ambiguity Notes: The Commissioner's approval process is explicitly exempted from standard rulemaking provisions, which could result in a lack of transparency or public input regarding the technical standards for verification.
Legislation ID: 53380
Bill URL: View Bill
This bill establishes regulations for social media platforms regarding minors aged 15 and younger, requiring the platforms to implement anonymous age verification and to prohibit minors under 14 from creating accounts without parental consent. It also outlines penalties for violations and the responsibilities of social media companies in managing accounts for minors. Furthermore, it addresses the dissemination of material harmful to minors and sets forth age verification requirements for commercial entities that publish such content.
| Date | Action |
|---|---|
| 2025-03-05 | Introduction and first reading, referred to Commerce Finance and Policy |
Why Relevant: The bill mandates age verification and age assessment for both social media access and adult-oriented content.
Mechanism of Influence: It requires commercial entities to use third-party 'anonymous age verification' to gate access (Sec. 2, Subd. 2) and compels social media platforms to identify and prohibit underage users.
Evidence:
Ambiguity Notes: The term 'commercially reasonable method' for age verification is not strictly defined, leaving the specific technology and data-sharing requirements to third-party providers.
Why Relevant: The bill requires platforms to use internal data and categorization to identify and remove underage users, similar to age-estimation mandates.
Mechanism of Influence: Platforms must terminate accounts they 'treat or categorize' as belonging to minors under 14, effectively requiring the use of behavioral profiling or age-estimation algorithms for compliance.
Evidence:
Ambiguity Notes: The phrase 'likely younger than 14' suggests a duty to monitor user activity or metadata to infer age, which may impact user privacy and anonymity.
Why Relevant: It establishes a compliance and enforcement infrastructure including user redress and state-level oversight.
Mechanism of Influence: The bill provides for a 90-day dispute window for account terminations and empowers the Attorney General to seek significant civil penalties for non-compliance.
Evidence:
Ambiguity Notes: The interaction between 'anonymous' verification and the requirement to provide a 90-day dispute window may create tensions in how user identity is maintained during the appeal process.
Legislation ID: 33474
Bill URL: View Bill
This bill establishes regulations for social media platforms operating in Minnesota, specifically targeting algorithms that direct user-generated content towards minors. It defines key terms related to social media usage, sets forth prohibitions on algorithmic targeting, and outlines requirements for parental consent for minors. The bill also includes provisions for liability and penalties for violations, aiming to create a safer online environment for children.
| Date | Action |
|---|---|
| 2025-02-20 | Author added Stephenson |
| 2025-02-13 | Authors added Engen and Burkel |
| 2025-02-10 | Introduction and first reading, referred to Commerce Finance and Policy |
Why Relevant: The bill mandates age-gating through parental consent.
Mechanism of Influence: Subdivision 2(b) requires platforms to obtain 'verifiable parental consent' before a minor under 18 can open an account, creating a mandatory age-verification/gatekeeping infrastructure.
Evidence:
Ambiguity Notes: The term 'verifiable parental consent' is not defined, leaving platforms to determine the level of identity verification required, which could range from simple checkboxes to invasive ID collection.
Why Relevant: The bill mandates product design changes to mitigate algorithmic influence.
Mechanism of Influence: By prohibiting 'social media algorithms' for minors and requiring 'chronological' displays, the bill forces a specific mitigation strategy on recommender systems.
Evidence:
Ambiguity Notes: The definition of 'recommendation feature' is broad, encompassing any mechanism that considers 'any metric of user engagement,' which could include basic platform functionalities.
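Operationally, the mandate reduces to a branch in feed construction. A minimal sketch with invented field names:

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    created_at: float          # Unix timestamp
    engagement_score: float    # any "metric of user engagement"

def build_feed(posts: list[Post], viewer_is_minor: bool) -> list[Post]:
    if viewer_is_minor:
        # Mandated: chronological only -- no engagement signal may
        # influence ordering for minor accounts.
        return sorted(posts, key=lambda p: p.created_at, reverse=True)
    # Permitted for adults: engagement-based ranking.
    return sorted(posts, key=lambda p: p.engagement_score, reverse=True)
```

The branch only works if the platform reliably knows viewer_is_minor, which is how a design mandate quietly becomes an age-determination mandate.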
The bill establishes requirements for mobile sports betting operators, including maintaining business records for at least 3.5 years, allowing for audits by the commissioner, and defining various offenses related to sports betting, particularly concerning underage wagering and unauthorized acceptance of bets. It outlines penalties for violations and establishes guidelines for the transfer of private data collected through wagering activities.
| Date | Action |
|---|---|
| 2025-02-13 | Author added Pratt |
| 2025-02-06 | Introduction and first reading |
| 2025-02-06 | Referred to State and Local Government |
Why Relevant: The bill mandates age and identity verification for all users of the mobile platform.
Mechanism of Influence: Operators must verify a user's age and identity before allowing them to establish an account or place a wager, which mirrors the age-gating requirements found in child safety legislation.
Evidence:
Ambiguity Notes: While the verification is for gambling (age 21+), the technical requirement for 'identity verification service providers' is functionally similar to age-assurance mandates in the EU CSA Act.
Why Relevant: The bill requires long-term data preservation of user metadata and communication-adjacent data (IP addresses).
Mechanism of Influence: Operators are compelled to retain personally identifiable information, IP addresses, and transaction history for 3.5 years for inspection by the commissioner.
Evidence:
Ambiguity Notes: The retention period is specific and exceeds many general privacy law requirements, focusing on auditability and law enforcement access.
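In operational terms, the retention clause bundles PII with network metadata into records that cannot be purged for 3.5 years. A sketch with assumed fields:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

RETENTION = timedelta(days=int(3.5 * 365))  # 3.5 years, per the bill

@dataclass
class WagerRecord:
    user_name: str       # personally identifiable information
    ip_address: str      # network metadata, retained alongside PII
    wager_details: str
    created_at: datetime

def earliest_purge_date(record: WagerRecord) -> datetime:
    # Until this date, the record must remain available for
    # inspection by the commissioner -- a standing breach surface.
    return record.created_at + RETENTION
```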
Why Relevant: The bill establishes a mandatory reporting and centralized monitoring infrastructure for 'suspicious' activity.
Mechanism of Influence: Independent integrity monitoring providers must immediately report suspicious activity to a state commissioner and other licensees, creating a centralized routing system for behavioral alerts.
Evidence:
Ambiguity Notes: The 'suspicious activity' here refers to betting patterns rather than content, but the reporting infrastructure and mandatory nature align with centralized reporting signals.
Legislation ID: 30477
Bill URL: View Bill
This bill establishes regulations concerning social media platforms operating in Minnesota, particularly focusing on the use of algorithms that target minors. It defines key terms related to social media and outlines prohibitions against targeting user-generated content at minors through recommendation features. The bill also mandates parental consent for minors to create accounts and outlines penalties for non-compliance.
| Date | Action |
|---|---|
| 2025-02-17 | Introduction and first reading |
Why Relevant: The bill mandates age verification for minors, a core element of the EU Chat Act's compliance infrastructure.
Mechanism of Influence: Platforms must implement systems to identify users under 18 to ensure they obtain 'verifiable parental consent' before account creation.
Evidence:
Ambiguity Notes: The term 'verifiable parental consent' is not strictly defined, leaving the specific technology (e.g., ID upload, credit card check) to the platform's discretion.
Why Relevant: The bill mandates product-level mitigation by prohibiting specific recommendation features for a protected class.
Mechanism of Influence: Platforms are legally barred from using engagement-based algorithms to prioritize content for minors, effectively forcing a 'chronological' or 'search-only' design for those users.
Evidence:
Ambiguity Notes: The prohibition on 'targeting' content could be interpreted broadly to include any algorithmic curation, potentially affecting safety-related content moderation if not carefully implemented.
Why Relevant: The bill provides a safe harbor for content filtering, which relates to the 'reliable technologies' and 'mitigation' themes of the EU Act.
Mechanism of Influence: Platforms are not penalized for using algorithms that act as parental controls or filters for 'banned material,' which may incentivize the use of automated scanning tools.
Evidence:
Ambiguity Notes: The term 'banned material' is undefined, potentially allowing platforms to use broad automated detection tools under this exception.
Legislation ID: 52887
Bill URL: View Bill
This bill establishes requirements for age verification on websites that contain material deemed harmful to minors. It includes definitions of key terms, sets forth the obligations of commercial entities, outlines data privacy protections, and provides for enforcement mechanisms by the attorney general as well as a private right of action for individuals affected by violations.
| Date | Action |
|---|---|
| 2025-03-03 | Introduction and first reading |
Why Relevant: The bill establishes a mandatory age-verification regime for specific categories of online content.
Mechanism of Influence: Commercial entities must gate access to content deemed 'harmful to minors' using third-party databases or state-approved methods.
Evidence:
Ambiguity Notes: The definition of 'material harmful to minors' relies on 'contemporary community standards' (Subd. 1(g)), which is a broad legal standard, but the mandate itself is a standard age-gate.
Legislation ID: 74402
Bill URL: View Bill
This bill proposes regulations for social media platforms concerning minors aged 15 and younger. It mandates anonymous age verification for platforms that may expose minors to harmful content, establishes requirements for account management based on age, and outlines penalties for non-compliance. The bill seeks to protect young users from the risks associated with social media usage.
| Date | Action |
|---|---|
| 2025-03-17 | Introduction and first reading |
| 2025-03-17 | Referred to Commerce and Consumer Protection |
Why Relevant: The bill mandates age verification and age-gating, a core 'Strong Signal' of bills patterned on the EU CSA Act.
Mechanism of Influence: Commercial entities publishing harmful material 'must use anonymous age verification to verify that the age of a person... is 18 years of age or older' [Sec. 2, Subd. 2(a)]. Additionally, social media platforms must enforce age-based bans for users under 14 and consent requirements for those 14-15, necessitating a functional age-assessment mechanism.
Evidence:
Ambiguity Notes: While the bill mandates 'anonymous' verification to protect privacy, the requirement for third-party verification creates a compliance infrastructure similar to the EU's proposed age-verification mandates for app stores and services.
Why Relevant: The bill includes data preservation and internal control features related to account termination and verification data.
Mechanism of Influence: It mandates that platforms 'permanently delete all personal information' upon account termination [Subd. 2(c)] and prohibits third-party verifiers from retaining personal identifying information once age is verified [Sec. 2, Subd. 7(1)].
Evidence:
Ambiguity Notes: These provisions act as a counter-balance to the surveillance potential of age verification, similar to the 'least intrusive' and data minimization requirements in the EU Act.
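Taken together, the two provisions describe a verify-then-discard flow: compute an over/under answer, keep nothing. A minimal sketch; the statute constrains what persists, not how anyone audits that it doesn't.

```python
from datetime import date

def anonymous_age_check(dob: date) -> bool:
    """Return only the age band; retain nothing.

    The statutory requirement is that identifying inputs (here a
    date of birth; in practice an ID document) are discarded once
    this boolean is produced. Whether discarding actually happens
    is an implementation detail the statute cannot observe.
    """
    today = date.today()
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    return age >= 18
    # dob goes out of scope here; a compliant verifier must also
    # ensure no copy persists in logs, caches, or analytics.
```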
Legislation ID: 31331
Bill URL: View Bill
This bill addresses the growing concern of human trafficking and child exploitation, particularly in the context of the internet. It seeks to implement filters on internet-enabled devices that block access to websites facilitating trafficking and child exploitation. The bill also outlines definitions related to trafficking, adult entertainment, and obscene materials, and sets forth requirements for retailers of internet-enabled devices to ensure compliance with these new regulations.
| Date | Action |
|---|---|
| 2025-01-27 | Introduction and first reading |
Why Relevant: The bill mandates default-on filtering for internet-enabled devices to block specific content categories, including child pornography.
Mechanism of Influence: Retailers must ensure devices have 'active and operating' filters that block content by default, shifting the burden of content control to the device/retailer level.
Evidence:
Ambiguity Notes: The term 'retailer' is broad, including manufacturers and ISPs, potentially creating overlapping compliance duties.
Why Relevant: The bill requires age verification to bypass the default content restrictions.
Mechanism of Influence: To deactivate the filter, a consumer must 'verify that the consumer is 18 years of age or older' using personal identification information.
Evidence:
Ambiguity Notes: The bill does not specify the method of verification beyond 'personal identification information,' leaving the technical implementation to retailers.
Why Relevant: The bill mandates reporting of child pornography to a centralized national body (NCMEC).
Mechanism of Influence: Retailers are required to forward reports of child pornography received through their consumer reporting mechanisms to the NCMEC CyberTipline.
Evidence:
Ambiguity Notes: None
Why Relevant: The definition of filtering technology explicitly includes monitoring of private communications like email and chat.
Mechanism of Influence: By defining a filter as something that blocks access to 'e-mail, chat, or other Internet-based communications' based on 'content,' the bill implies a need for scanning or analysis of communication data.
Evidence:
Ambiguity Notes: While it doesn't explicitly mention end-to-end encryption, a filter that blocks 'chat' based on 'content' functionally necessitates client-side analysis or the ability to inspect encrypted traffic.
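The structural problem can be shown in a few lines: a filter that blocks 'chat' by 'content' must run wherever plaintext exists, and in an end-to-end encrypted conversation that is only the endpoint. A sketch with an invented blocklist rule:

```python
BLOCKED_TERMS = {"example-blocked-term"}  # stand-in for any content rule

def client_side_filter(plaintext: str) -> bool:
    """Runs on the device, after decryption (or before encryption).

    The filter's placement is forced: in an end-to-end encrypted chat,
    no intermediary ever sees plaintext, so a content-based block can
    only execute on the endpoint itself.
    """
    return not any(term in plaintext.lower() for term in BLOCKED_TERMS)

def deliver(message_decrypted_on_device: str) -> str | None:
    # The message is displayed only if the local filter passes it.
    if client_side_filter(message_decrypted_on_device):
        return message_decrypted_on_device
    return None  # blocked before rendering
```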
Why Relevant: The bill establishes a compliance infrastructure including reporting mechanisms and ongoing maintenance of detection technologies.
Mechanism of Influence: Retailers must establish a 'reporting mechanism, including but not limited to a website or call center' and make 'ongoing efforts' to ensure filter functionality.
Evidence:
Ambiguity Notes: None
This bill introduces provisions for lawful sports betting in Minnesota, detailing definitions, licensing requirements, and the roles of various stakeholders in the sports betting ecosystem. It prohibits local restrictions on sports betting and outlines the responsibilities of the commissioner overseeing the regulation. The bill also specifies the penalties for non-compliance and mandates reporting requirements for operators.
| Date | Action |
|---|---|
| 2025-02-06 | Author added Maye Quade |
| 2025-02-03 | Introduction and first reading |
| 2025-02-03 | Referred to State and Local Government |
Why Relevant: The bill mandates age and identity verification for service access, a core feature of the EU CSA Act's gatekeeping requirements.
Mechanism of Influence: Operators are prohibited from allowing account creation without first utilizing an 'approved identity verification service provider' to confirm the user's age and identity.
Evidence:
Ambiguity Notes: The specific technical standards for 'approved' verification services are left to the commissioner's future rulemaking.
Why Relevant: The bill requires ongoing monitoring of user activity to detect 'unusual' patterns, mirroring the detection obligations for platforms.
Mechanism of Influence: Licensees must contract with 'independent integrity monitoring' providers to identify and report suspicious activity to the state and governing bodies.
Evidence:
Ambiguity Notes: The monitoring is strictly limited to betting patterns and financial integrity rather than communication content or media scanning.
Why Relevant: The bill mandates long-term data preservation of user PII and metadata, including IP addresses.
Mechanism of Influence: Operators must maintain records of all wagers, personally identifiable information, and IP addresses for 3.5 years for state inspection.
Evidence:
Ambiguity Notes: None
Why Relevant: The bill establishes a 'duty of care' that requires providers to monitor user behavior and intervene to mitigate risks.
Mechanism of Influence: Providers must monitor for 'hazardous or addictive behavior' and take 'reasonable measures to reduce the harm,' which functions as a mandated risk-mitigation strategy.
Evidence:
Ambiguity Notes: The definition of 'hazardous' behavior and the required 'intervention' methods are not explicitly defined in the statute.
Legislation ID: 259086
Bill URL: View Bill
House Bill No. 708 requires app store providers to verify the age of users at account creation, ensure minors' accounts are linked to a parent account, and obtain verifiable parental consent for app downloads and purchases. It prohibits enforcing contracts against minors without consent and requires developers to verify age data. The bill also establishes enforcement mechanisms, including a private right of action for parents and provisions for the Attorney General to act against violators. Additionally, it outlines standards for age verification and provides safe harbor for compliant developers.
| Date | Action |
|---|---|
| 2026-01-14 | (H) Referred To Judiciary A |
Why Relevant: The bill mandates age verification for all users of app stores and digital services.
Mechanism of Influence: App store providers must use 'commercially available methods' or state-approved processes to verify age categories (Child, Younger Teenager, Older Teenager, Adult) for every account holder in the state.
Evidence:
Ambiguity Notes: The bill does not specify which 'commercially available methods' are acceptable, leaving the technical implementation to the Attorney General's rulemaking.
Why Relevant: It imposes specific gatekeeping and risk-management duties on app store providers.
Mechanism of Influence: App stores are required to prevent minors from downloading apps without parental consent and must share age category data with developers to enforce age-related restrictions.
Evidence:
Ambiguity Notes: The requirement to share 'age category data' with developers creates a standardized infrastructure for age-gating across the entire app ecosystem.
Why Relevant: The bill mandates risk mitigation strategies specifically targeting child exploitation and grooming.
Mechanism of Influence: Digital service providers (including those offering chat rooms or message boards) must develop strategies to mitigate exposure to grooming, trafficking, and child pornography.
Evidence:
Ambiguity Notes: While the bill does not explicitly command 'scanning,' the duty to 'mitigate' exposure to specific illegal content like grooming often necessitates automated detection or moderation technologies.
Why Relevant: The bill provides legal cover for providers to implement detection and blocking technologies.
Mechanism of Influence: It explicitly states that nothing in the act prevents providers from taking 'reasonable measures' to 'detect' or 'block' unlawful or harmful material.
Evidence:
Ambiguity Notes: This 'limitation' functions as a safe harbor for platforms to implement scanning and filtering technologies that might otherwise conflict with privacy expectations.
Why Relevant: The bill requires data preservation for compliance purposes.
Mechanism of Influence: App store providers must maintain records of age verification and parental consent for compliance purposes.
Evidence:
Ambiguity Notes: The scope of 'compliance records' is not fully defined, potentially requiring the retention of sensitive identity verification data.
Legislation ID: 235062
Bill URL: View Bill
This bill introduces a new section to Chapter 407 of the Revised Statutes of Missouri, mandating age verification for online content that includes sexual material harmful to minors. It defines key terms related to age verification, outlines the responsibilities of commercial entities, and sets penalties for non-compliance. Additionally, it provides exemptions for news organizations and outlines the enforcement mechanisms by the attorney general.
| Date | Action |
|---|---|
| 2026-01-07 | S First Read |
| 2025-12-01 | Prefiled |
Why Relevant: The bill establishes a mandatory age-verification framework for online platforms, which is a core pillar of the EU Chat Act's access control and mitigation strategy.
Mechanism of Influence: It compels commercial entities to 'use reasonable age verification methods' (Sec 407.3405.2) to gate access to specific content, effectively requiring the implementation of identity-checking infrastructure.
Evidence:
Ambiguity Notes: The phrase 'commercially reasonable method that relies on public or private transactional data' (Sec 407.3405.1(1)(b)) is broad and could encompass various third-party data-scraping or identity-matching technologies.
Why Relevant: The bill includes data-handling and compliance infrastructure requirements similar to the EU Chat Act's provisions regarding user privacy during the verification process.
Mechanism of Influence: It creates a statutory prohibition against the 'retention of identifying information' (Sec 407.3405.3) by the entity performing the verification, imposing a technical requirement on the compliance architecture.
Evidence:
Ambiguity Notes: While intended as a privacy safeguard, the prohibition on retention may conflict with other legal requirements for data preservation or auditability in different jurisdictions.
Legislation ID: 235128
Bill URL: View Bill
This bill amends chapter 407 of the Revised Statutes of Missouri by adding a new section that mandates age verification for access to adult websites. It defines key terms related to age verification and outlines the responsibilities of commercial entities in verifying the age of individuals accessing sexual material, as well as penalties for non-compliance. The bill aims to protect minors from exposure to harmful sexual content online.
| Date | Action |
|---|---|
| 2026-01-07 | S First Read |
| 2025-12-05 | Prefiled |
Why Relevant: The bill explicitly mandates age verification for access to specific online services, a core pillar of the EU proposal's child protection framework.
Mechanism of Influence: Commercial entities must implement digital ID or third-party verification systems to gate access, effectively requiring a 'reliable identification' of users.
Evidence:
Ambiguity Notes: The term 'commercially reasonable method' for verifying age using transactional data is not strictly defined, leaving technical implementation details to the provider's discretion.
Why Relevant: The bill addresses data retention and privacy, which is a critical component of the compliance infrastructure for online safety mandates.
Mechanism of Influence: It prohibits the retention of identifying information used during the age verification process, creating a specific compliance duty for platforms and third-party verifiers.
Evidence:
Ambiguity Notes: The bill does not define 'identifying information,' which could lead to uncertainty regarding whether anonymized or hashed data can be retained for audit purposes.
Why Relevant: The bill establishes a compliance and enforcement framework involving state-level oversight and significant financial penalties.
Mechanism of Influence: The Attorney General is empowered to bring civil actions, with penalties reaching $10,000 per day for non-compliance with age verification requirements.
Evidence:
Ambiguity Notes: The threshold of 'more than one-third' of content being harmful to minors may be difficult for platforms to calculate precisely without automated scanning or manual audits.
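The ambiguity note can be made concrete: deciding whether a site crosses the one-third threshold means classifying every hosted item and computing a ratio, which is itself a scanning obligation. A sketch, with the per-item classifier left as a stub:

```python
def is_harmful_to_minors(item: str) -> bool:
    """Stand-in classifier. The statute defines the category legally
    ('harmful to minors'); turning that into a per-item decision is
    left entirely to the platform."""
    return "harmful" in item  # toy rule for illustration

def crosses_threshold(catalog: list[str]) -> bool:
    if not catalog:
        return False
    flagged = sum(1 for item in catalog if is_harmful_to_minors(item))
    # 'More than one-third' of content triggers the mandate.
    return flagged / len(catalog) > 1 / 3
```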
Legislation ID: 235171
Bill URL: View Bill
This bill amends chapter 1 of RSMo by introducing the Guidelines for User Age-Verification and Responsible Dialogue Act of 2026 (GUARD Act). It mandates that artificial intelligence chatbots implement effective age verification processes to restrict access for minors. The bill outlines definitions, requirements for covered entities, penalties for violations, and establishes the role of the attorney general in enforcing compliance.
| Date | Action |
|---|---|
| 2026-01-07 | S First Read |
| 2025-12-19 | Prefiled |
Why Relevant: The bill mandates a robust age-verification and gatekeeping infrastructure for AI-driven interpersonal communication services.
Mechanism of Influence: Section 1.2058.5 requires providers to freeze all existing accounts and block new ones until age is verified via 'reasonable age verification measures,' which include government ID. This mirrors the EU CSA Act's emphasis on age-verification as a mitigation tool.
Evidence:
Ambiguity Notes: The term 'commercially reasonable method' for age verification is not strictly defined beyond the inclusion of government ID, leaving room for the Attorney General to define acceptable technologies via rulemaking.
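Section 1.2058.5's freeze obligation amounts to a two-state machine: accounts default to frozen, and verification is the only transition to active. A sketch with invented state names:

```python
from enum import Enum, auto

class AccountState(Enum):
    FROZEN = auto()   # default for existing and new accounts
    ACTIVE = auto()   # reachable only through verification

def on_statute_effective(accounts: dict[str, AccountState]) -> None:
    # Existing accounts are frozen wholesale until age is verified.
    for account_id in accounts:
        accounts[account_id] = AccountState.FROZEN

def on_verified_adult(accounts: dict[str, AccountState], account_id: str) -> None:
    # 'Reasonable age verification measures' (e.g., government ID)
    # are the only path out of the frozen state.
    accounts[account_id] = AccountState.ACTIVE
```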
Why Relevant: The bill imposes a 'safety-by-design' obligation regarding the solicitation of minors.
Mechanism of Influence: Section 1.2058.3 creates legal liability for the 'design' or 'development' of AI that 'poses a risk' of soliciting minors. This effectively compels providers to implement internal risk assessments and mitigation features in their AI models to prevent prohibited outputs.
Evidence:
Ambiguity Notes: The phrase 'poses a risk of soliciting' is broad and could be interpreted to require proactive monitoring or filtering of AI-generated content to ensure compliance.
Why Relevant: It establishes a compliance and enforcement framework with significant penalties for platform failures.
Mechanism of Influence: The bill grants the Attorney General investigative powers (subpoenas) and the authority to seek civil penalties of up to $100,000 per violation, creating a high-stakes compliance environment similar to the EU's proposed oversight authorities.
Evidence:
Ambiguity Notes: The bill does not specify if 'each violation' refers to each user account or each instance of a chatbot's output, potentially leading to massive cumulative fines.
Legislation ID: 250995
Bill URL: View Bill
This bill introduces a new section to Chapter 537 of the Revised Statutes of Missouri, focusing on the creation and distribution of altered sexual depictions of identifiable persons. It defines key terms, outlines offenses related to generating or promoting such depictions without consent, and establishes civil and criminal penalties for violations. Additionally, it mandates covered platforms to implement processes for reporting and removing nonconsensual altered sexual depictions.
| Date | Action |
|---|---|
| 2026-01-07 | S First Read |
| 2026-01-06 | Prefiled |
Why Relevant: The bill mandates a specific removal and blocking window for reported content.
Mechanism of Influence: Section 7(4)(a) requires platforms to remove content 'not later than forty-eight hours after receiving such request,' creating a strict compliance window similar to the EU's removal order framework.
Evidence:
Ambiguity Notes: The term 'as soon as practicable' alongside the 48-hour limit suggests a high priority for removal that may override standard internal review processes.
Why Relevant: The bill requires platforms to detect and remove identical copies of reported material.
Mechanism of Influence: Section 7(4)(b) compels platforms to use 'reasonable efforts to identify and remove any known identical copies,' which practically necessitates the use of hash-matching or similar detection technologies.
Evidence:
Ambiguity Notes: The phrase 'reasonable efforts to identify' is not defined, leaving open whether this requires proactive scanning of all user uploads against a database of reported content.
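In practice, 'reasonable efforts to identify... known identical copies' means hash matching: digest each reported file and compare uploads against the set. A minimal sketch using exact cryptographic hashes; real deployments tend toward perceptual hashing to catch re-encoded copies, which the bill neither requires nor rules out.

```python
import hashlib

reported_hashes: set[str] = set()

def digest(content: bytes) -> str:
    return hashlib.sha256(content).hexdigest()

def register_reported(content: bytes) -> None:
    # Called when a takedown request under Section 7 is honored.
    reported_hashes.add(digest(content))

def is_known_copy(upload: bytes) -> bool:
    # Exact-match only: any one-bit change (recompression, cropping)
    # defeats SHA-256, which is why 'reasonable efforts' tends to pull
    # platforms toward perceptual hashing of all uploads.
    return digest(upload) in reported_hashes
```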
Why Relevant: The bill establishes a specific reporting and notification infrastructure for users.
Mechanism of Influence: Section 7(1) requires platforms to 'establish a process' for notification and removal requests, including specific metadata and identification requirements.
Evidence:
Ambiguity Notes: While this is a notice-and-takedown system rather than a centralized hub, it creates a statutory compliance infrastructure for content moderation.
Legislation ID: 234617
Bill URL: View Bill
Senate Bill No. 901 introduces a new section to chapter 407 of the Revised Statutes of Missouri, mandating age verification for individuals accessing adult websites. The bill defines key terms related to age verification, outlines the responsibilities of commercial entities, and sets penalties for non-compliance. It emphasizes the importance of not retaining identifying information and provides specific notices that must be displayed on websites. The bill also delineates exceptions for news organizations and clarifies the role of internet service providers in relation to these regulations.
| Date | Action |
|---|---|
| 2026-01-08 | Second Read and Referred S General Laws Committee |
| 2026-01-07 | S First Read |
| 2025-12-01 | Prefiled |
Why Relevant: The bill implements a mandatory age verification/assessment regime, which is a core pillar of the EU Chat Act's child protection framework.
Mechanism of Influence: It compels commercial entities to 'use reasonable age verification methods' (Section 407.3405.2) to gate access to specific content, potentially requiring the use of third-party verification infrastructure.
Evidence:
Ambiguity Notes: The term 'commercially reasonable method' (Section 407.3405.1(1)(b)b) is broad and could encompass various technologies with differing privacy implications.
Why Relevant: The bill addresses data preservation and internal controls, though it takes an opposite approach to the EU Chat Act by strictly prohibiting data retention.
Mechanism of Influence: It creates a statutory duty for entities and third-party verifiers to 'not retain any identifying information of the individual' (Section 407.3405.3), backed by significant per-instance fines.
Evidence:
Ambiguity Notes: The bill does not define 'identifying information,' leaving it unclear if anonymized or hashed data used for verification would be permitted for retention.
Why Relevant: The bill defines the scope of covered entities, including social media platforms, which overlaps with the 'hosting services' and 'interpersonal communication services' targeted by the EU Chat Act.
Mechanism of Influence: It applies to any internet website, 'including a social media platform' (Section 407.3405.2), provided the content threshold is met, while exempting infrastructure providers like ISPs and cloud services.
Evidence:
Ambiguity Notes: The 'more than one-third' content threshold (Section 407.3405.2) may require platforms to perform ongoing internal audits or risk assessments of their content distribution to determine if they fall under the mandate.
Legislation ID: 263260
Bill URL: View Bill
The Transparency in Artificial Intelligence Risk Management Act seeks to regulate large frontier developers and chatbot providers by mandating the creation and publication of public safety and child protection plans. These plans must detail risk assessments and mitigation strategies for catastrophic risks and child safety incidents related to AI technologies. The bill outlines specific definitions, reporting requirements, and compliance measures to ensure accountability and transparency in the deployment of AI systems.
| Date | Action |
|---|---|
| 2026-01-15 | Date of introduction |
Why Relevant: The bill mandates ongoing risk assessments and mitigation plans for providers of services likely to be accessed by minors.
Mechanism of Influence: Large chatbot providers must document how they assess potential child safety risks and apply mitigations to address those risks, mirroring the EU framework's duty to identify and mitigate risks to minors.
Evidence:
Ambiguity Notes: The definition of 'child safety risk' includes damage to mental health constituting 'severe emotional distress,' a broad standard that could be interpreted to require extensive moderation or product feature changes.
Why Relevant: The act establishes a centralized reporting requirement for safety incidents involving minors.
Mechanism of Influence: Providers are legally compelled to report 'child safety incidents' to the Attorney General within fifteen days of discovery, creating a state-level clearinghouse for incident data similar to the EU's reporting obligations.
Evidence:
Ambiguity Notes: The reporting mechanism is established by the Attorney General, and while it requires a 'short and plain statement,' the specific metadata or user information required is not yet defined.
Why Relevant: The bill includes data preservation and internal compliance infrastructure requirements.
Mechanism of Influence: Providers must retain unredacted versions of safety documents for five years and implement internal governance practices to ensure the safety plan is followed, including whistleblower protections.
Evidence:
Ambiguity Notes: The retention requirement applies to 'unredacted information' in documents published to comply with the act, which may include sensitive internal assessments of model behavior.
Legislation ID: 122219
Bill URL: View Bill
The Artificial Intelligence Consumer Protection Act is designed to protect consumers from algorithmic discrimination by setting forth requirements for developers and deployers of high-risk artificial intelligence systems. It outlines definitions, responsibilities, and documentation requirements to ensure compliance with anti-discrimination laws. The act mandates developers to disclose known risks and implement risk management policies, while deployers must conduct impact assessments and use reasonable care in their deployment of such systems.
| Date | Action |
|---|---|
| 2026-01-07 | Title printed. Carryover bill |
| 2025-01-28 | Notice of hearing for February 06, 2025 |
| 2025-01-24 | Referred to Judiciary Committee |
| 2025-01-22 | Date of introduction |
Why Relevant: The bill mandates provider risk assessments and mitigation plans for high-risk systems.
Mechanism of Influence: Section 4(2) requires deployers to 'implement a risk management policy and program,' and Section 4(3) requires an 'impact assessment' for each high-risk AI system deployed. This mirrors the EU Chat Act's requirement for ongoing risk assessments.
Evidence:
Ambiguity Notes: While the administrative structure is similar, the 'risk' being assessed is 'algorithmic discrimination' rather than child safety or illegal content.
Why Relevant: The bill includes data preservation and internal control requirements.
Mechanism of Influence: Section 4(3)(f) requires deployers to maintain impact assessments and related records for at least three years after the final deployment of a system.
Evidence:
Ambiguity Notes: None
Why Relevant: The bill establishes a compliance infrastructure and consumer redress mechanism.
Mechanism of Influence: Section 4(4)(b)(iii) requires deployers to provide consumers an 'opportunity to appeal any adverse consequential decision,' which must allow for 'human review if technically feasible.'
Evidence:
Ambiguity Notes: None
The Saving Human Connection Act establishes regulations for covered platforms that operate generative artificial intelligence systems. It defines key terms, outlines responsibilities for platforms to protect users, especially minors, and mandates transparency regarding the non-human nature of chatbots. The act also provides for enforcement mechanisms and civil penalties for violations.
| Date | Action |
|---|---|
| 2026-01-13 | Referred to Banking, Commerce and Insurance Committee |
| 2026-01-09 | Date of introduction |
| 2026-01-09 | Kauth FA563 filed |
| 2026-01-09 | Murman FA564 filed |
| 2026-01-09 | Murman FA565 filed |
Why Relevant: The bill mandates the implementation of age verification systems for access to specific platform features.
Mechanism of Influence: Covered platforms must gate 'human-like features' behind 'reasonable age verification systems' (Sec. 3(2)(b)) to ensure they are 'not made available to minors.'
Evidence:
Ambiguity Notes: The term 'reasonable age verification' is not defined, leaving open whether this requires government ID, biometrics, or third-party verification services.
Why Relevant: The bill compels platforms to operate automated detection and monitoring technologies to scan user interactions for specific content.
Mechanism of Influence: Platforms are legally required to 'detect' and 'report' emergency situations (Sec. 3(2)(e)) and 'detect and prevent emotional dependence' (Sec. 3(2)(f)). This necessitates continuous monitoring of the text, audio, or visual medium of the chatbot interaction.
Evidence:
Ambiguity Notes: While the target is 'emergency situations' rather than CSAM, the mechanism—mandated automated detection within private or semi-private AI interactions—parallels the EU's detection order framework.
Why Relevant: The bill establishes a mandatory reporting and mitigation framework for detected content.
Mechanism of Influence: Once an 'emergency situation' is detected, the platform has a statutory duty to 'report' and 'mitigate' the situation (Sec. 3(2)(e)), prioritizing safety over other interests.
Evidence:
Ambiguity Notes: The bill does not specify the recipient of the 'report,' though Sec. 4(1) grants the Attorney General enforcement power, suggesting a state-level reporting nexus.
Why Relevant: The bill mandates specific product design changes and 'mitigation' plans to address perceived risks of the technology.
Mechanism of Influence: Platforms must provide a 'default version' without human-like features (Sec. 3(2)(c)) and 'mitigate' risks of emotional dependence (Sec. 3(2)(f)), effectively requiring safety-by-design modifications.
Evidence:
Ambiguity Notes: The requirement to 'prevent emotional dependence' may require platforms to throttle usage or alter AI personality traits based on individual user monitoring.
Legislation ID: 235925
Bill URL: View Bill
This bill introduces the Age-Appropriate Design Code Act, which sets forth definitions, exclusions, duties of care, and privacy settings specifically aimed at protecting minors' personal data. It outlines the responsibilities of covered businesses regarding the processing of minors' data and establishes stringent requirements for privacy settings to safeguard minors' online experiences.
| Date | Action |
|---|---|
| 2026-01-07 | To Be Introduced 01/07/2026 and referred to Commerce and Consumer Affairs |
Why Relevant: The bill mandates age verification/assurance mechanisms for online services likely to be accessed by minors.
Mechanism of Influence: It requires the Attorney General to identify 'commercially reasonable and technically feasible methods' for businesses to determine a user's age, creating a statutory mandate for age-gating technologies.
Evidence:
Ambiguity Notes: The term 'technically feasible methods' is not defined, potentially allowing for invasive identity verification or biometric analysis to satisfy the requirement.
Why Relevant: The bill imposes risk-mitigation duties regarding product design and algorithmic recommendation systems.
Mechanism of Influence: Businesses must ensure their design does not result in 'foreseeable emotional distress' or 'compulsive use,' effectively mandating changes to moderation and product features to protect minors.
Evidence:
Ambiguity Notes: The 'duty of care' is broad; 'emotional distress' and 'compulsive use' are subjective standards that may compel platforms to implement aggressive content filtering or feature removal to avoid liability.
Why Relevant: The bill restricts interpersonal communication between adults and minors, which has implications for encrypted messaging.
Mechanism of Influence: By prohibiting direct messaging between minors and 'known adults' by default, platforms may be forced to verify the age and identity of all participants in a conversation, potentially requiring metadata analysis or client-side checks in E2EE environments.
Evidence:
Ambiguity Notes: The requirement to 'not permit direct messaging' unless expressly allowed requires the platform to know the age status of both parties, which is difficult to achieve without undermining anonymity or encryption.
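The default-off messaging rule needs an age status for both endpoints of every conversation, which is the pressure point for anonymous and E2EE services. A sketch with assumed status labels:

```python
from enum import Enum

class AgeStatus(Enum):
    MINOR = "minor"
    ADULT = "adult"
    UNKNOWN = "unknown"   # the hard case for anonymous or E2EE services

def dm_allowed(sender: AgeStatus, recipient: AgeStatus,
               minor_opted_in: bool = False) -> bool:
    """Default-deny messaging between a minor and a known adult.

    Note the UNKNOWN branch: a platform that cannot classify a user
    must either block conservatively or verify -- the mandate converts
    'we don't know' into a compliance risk.
    """
    pair = {sender, recipient}
    if AgeStatus.UNKNOWN in pair:
        return False  # conservative default
    if pair == {AgeStatus.MINOR, AgeStatus.ADULT}:
        return minor_opted_in  # only if expressly allowed
    return True
```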
Legislation ID: 235933
Bill URL: View Bill
This bill introduces new provisions under a chapter titled App Store Accountability to govern how app store providers must handle age verification and parental consent. It defines various age categories and outlines the responsibilities of app store providers and developers to ensure compliance with age-related restrictions and to protect the personal data of minors. The bill also establishes enforcement mechanisms and potential penalties for violations.
| Date | Action |
|---|---|
| 2026-01-07 | To Be Introduced 01/07/2026 and referred to Commerce and Consumer Affairs (HJ 1) |
Why Relevant: The bill mandates age verification and age assessment for all users of digital application platforms.
Mechanism of Influence: It requires app store providers to 'Verify the individual's age category' at the time of account creation using 'commercially available methods' (359-V:2, I(a)).
Evidence:
Ambiguity Notes: The term 'commercially available methods' is not defined, potentially encompassing high-friction identity verification or biometric age estimation.
Why Relevant: The bill imposes significant gatekeeping obligations on app store providers to control minor access.
Mechanism of Influence: App stores must 'Obtain verifiable parental consent' before allowing a minor to 'Download an app' or 'Purchase an app' (359-V:2, I(b)(2)).
Evidence:
Ambiguity Notes: This creates a centralized enforcement point where the app store is responsible for the compliance of all third-party apps.
Why Relevant: The bill establishes compliance infrastructure and data-sharing requirements between platforms and developers.
Mechanism of Influence: App store providers must provide 'Age category data' and 'status of verified parental consent' to developers upon request (359-V:2, I(d)).
Evidence:
Ambiguity Notes: While the bill requires 'industry-standard encryption' for this data, it mandates the creation and transmission of age-related metadata across the app ecosystem.
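Structurally, the data-sharing duty creates a per-account age label that flows to any requesting developer. A sketch; the category names are borrowed from the app-store bills' age bands and the field names are invented:

```python
from dataclasses import dataclass
from enum import Enum

class AgeCategory(Enum):      # bands of this shape appear in app-store bills
    CHILD = "child"
    YOUNGER_TEEN = "younger_teenager"
    OLDER_TEEN = "older_teenager"
    ADULT = "adult"

@dataclass
class AgeSignal:
    account_id: str
    category: AgeCategory
    parental_consent_verified: bool

def signal_for_developer(account_id: str, directory: dict[str, AgeSignal]) -> AgeSignal:
    """What the app store hands a requesting developer (359-V:2, I(d)).

    'Industry-standard encryption' protects the channel, not the fact
    that a per-account age label now flows to every developer who asks.
    """
    return directory[account_id]
```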
Legislation ID: 113555
Bill URL: View Bill
This bill mandates that manufacturers of electronic tablets and smartphones install filters that restrict minors from accessing obscene content online. It establishes civil and criminal liabilities for manufacturers and individuals who intentionally disable these filters to allow minors access to such material. The bill also provides a private right of action for parents or guardians if a minor accesses obscene material due to a violation of these provisions.
| Date | Action |
|---|---|
| 2026-01-07 | Refer for Interim Study: MA VV 01/07/2026 (HJ 1) |
| 2025-11-12 | Committee Report: Refer for Interim Study 11/12/2025 (Vote 17-0; CC) (HC 51, p. 14) |
| 2025-11-12 | Executive Session: 11/12/2025 10:00 am GP 230 |
| 2025-10-27 | ==CANCELLED== Subcommittee Work Session: 10/27/2025 11:00 am GP 230 |
| 2025-09-15 | Subcommittee Work Session: 09/15/2025 10:00 am GP 230 |
| 2025-09-10 | Full Committee Work Session: 09/10/2025 10:00 am GP 230 |
| 2025-03-03 | Executive Session: 03/03/2025 10:00 am LOB 206-208 |
| 2025-03-03 | Retained in Committee |
Why Relevant: The bill mandates age verification/assessment at the device activation level.
Mechanism of Influence: Manufacturers must 'Ask the user to provide the user’s age during activation' and 'Automatically enable the filter when the user is a minor' (RSA 507-I:2, II-III).
Evidence:
Ambiguity Notes: The bill does not specify the method of verification, only that the age is 'provided by the user,' which may lead to self-certification or more intrusive verification requirements to ensure accuracy.
Why Relevant: The bill compels the installation of detection and blocking technologies on devices.
Mechanism of Influence: Devices must contain a 'filter' defined as software 'capable of preventing the device from accessing or displaying obscene material' (RSA 507-I:1, III).
Evidence:
Ambiguity Notes: The term 'displaying' could be interpreted to require client-side scanning of content before it is rendered on the screen, potentially affecting encrypted content if accessed via manufacturer-controlled browsers.
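The activation flow the bill describes is small enough to state directly: prompt for an age at setup and enable the filter by default when the answer indicates a minor. A sketch; note that everything rests on the self-reported value:

```python
def activate_device(stated_age: int) -> dict:
    """Activation per RSA 507-I:2: ask age, auto-enable filter for minors.

    stated_age is whatever the user types -- the bill requires asking,
    not proving, which is why the ambiguity note flags a likely drift
    toward stronger verification to make the answer mean anything.
    """
    filter_enabled = stated_age < 18
    return {"filter_enabled": filter_enabled, "stated_age": stated_age}

assert activate_device(12)["filter_enabled"] is True
assert activate_device(30)["filter_enabled"] is False
```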
Why Relevant: The bill establishes a compliance and liability infrastructure for manufacturers.
Mechanism of Influence: Manufacturers are subject to the Consumer Protection Act and private lawsuits for failure to comply with filtering mandates (RSA 507-I:3, RSA 507-I:4).
Evidence:
Ambiguity Notes: None
Legislation ID: 251380
Bill URL: View Bill
SB 648 mandates that websites and applications that distribute material harmful to minors must verify the age of users before allowing access to such content. It establishes clear definitions of harmful material, outlines the responsibilities of commercial entities regarding age verification, and creates enforcement mechanisms including a private right of action for parents and oversight by the attorney general.
| Date | Action |
|---|---|
| 2026-01-08 | Hearing: 01/08/2026, Room 100, SH, 01:30 pm (SC 47) |
| 2026-01-07 | Introduced 01/07/2026 and Referred to Judiciary (SJ 1) |
Why Relevant: The bill mandates age verification for access to specific categories of online content.
Mechanism of Influence: It requires commercial entities to 'Require all users attempting to access such material to verify their age using a reasonable age verification method' (507-J:2, I, a).
Evidence:
Ambiguity Notes: The term 'reasonable age verification method' includes 'industry best practices' (507-J:1, IV, b), which may evolve to include more intrusive technologies over time.
Why Relevant: The bill contains strong data privacy protections that contrast with the data preservation requirements of the EU CSA Regulation.
Mechanism of Influence: It explicitly prohibits the retention of verification data, which prevents the creation of the types of databases or audit trails often required for compliance in more comprehensive safety regimes.
Evidence:
Ambiguity Notes: The prohibition on data retention (507-J:5) acts as a safeguard against the surveillance-heavy aspects of the EU model.
Legislation ID: 256969
Bill URL: View Bill
This legislation amends New Jersey's obscenity laws to require sexually oriented online entities to verify the age of individuals attempting to access obscene material, ensuring that only those 18 years or older can view such content. It outlines definitions of obscene material, establishes penalties for violations, and provides methods for age verification, including using third-party services or state motor vehicle identification systems.
| Date | Action |
|---|---|
| 2026-01-13 | Introduced in the Senate, Referred to Senate Judiciary Committee |
Why Relevant: The bill mandates age verification for online entities, a core component of the EU Chat Act's framework for protecting minors.
Mechanism of Influence: It requires platforms to 'verify that each individual that attempts to access or view the obscene material is at least 18 years of age' (Section 1.f.1), potentially requiring the integration of third-party ID services or state databases.
Evidence:
Ambiguity Notes: The definition of 'substantial portion' (Section 1.a.10) as one-third of revenue or content creates a specific threshold that may exclude general-purpose platforms while capturing niche social media or file-sharing sites.
Why Relevant: The bill includes requirements for user redress and contact points, similar to the compliance infrastructure in the EU proposal.
Mechanism of Influence: Entities must provide a 'means by which an individual at least 18 years of age who is wrongly denied access may contact the entity to correct the individual’s access restriction' (Section 1.f.3).
Evidence:
Ambiguity Notes: While it requires a contact point for access issues, it does not mandate the broader 'legal representative' or 'compliance officer' roles seen in more comprehensive regulations.
Legislation ID: 254680
Bill URL: View Bill
The Human Trafficking and Child Exploitation Prevention Act establishes regulations for products that provide internet access, mandating them to include digital blocking features that prevent minors from accessing obscene materials. The bill outlines the responsibilities of manufacturers and sellers, including maintaining the functionality of these blocking features and providing a reporting mechanism for users. It also details the process for deactivating these features, penalties for non-compliance, and the allocation of fees collected for supporting anti-trafficking initiatives.
| Date | Action |
|---|---|
| 2026-01-13 | Introduced in the Senate, Referred to Senate Law and Public Safety Committee |
Why Relevant: The bill mandates the installation and operation of filtering technologies to detect and block specific categories of content.
Mechanism of Influence: By requiring products to 'render any obscene material... inaccessible' and 'ensure that all child pornography... is inaccessible,' the law compels the use of scanning or filtering software at the device or application level.
Evidence:
Ambiguity Notes: The term 'digital blocking capability' is not technically defined, leaving open whether this must occur via DNS filtering, client-side scanning, or OS-level content analysis.
Why Relevant: The bill imposes a strict age-verification requirement for users wishing to bypass the content filters.
Mechanism of Influence: Users must 'present identification to verify that the consumer is 18 years of age or older' to deactivate the blocking features, creating a state-mandated identity-linked access control for unfiltered internet.
Evidence:
Ambiguity Notes: The bill does not specify the standards for 'identification,' potentially requiring government-issued IDs for internet de-filtering.
Why Relevant: The bill requires ongoing maintenance and mitigation efforts to ensure the effectiveness of the blocking technology.
Mechanism of Influence: Manufacturers are legally obligated to perform 'ongoing efforts' to ensure the filter 'functions properly,' which mirrors the EU's requirement for ongoing risk mitigation and technology updates.
Evidence:
Ambiguity Notes: The phrase 'reasonable and ongoing efforts' is broad and could be interpreted as a mandate for continuous updates to hash databases or AI-based detection models.
Why Relevant: The mandate functionally compels client-side scanning, which undermines end-to-end encryption (E2EE).
Mechanism of Influence: Because the 'product' (which can include software or devices) must ensure content is 'inaccessible,' it necessitates analyzing content on the device, before encryption or after decryption, effectively bypassing E2EE protections for private communications.
Evidence:
Ambiguity Notes: While encryption is not named, the requirement to block content 'on the product' makes it impossible to honor E2EE if the product is a communication device or app.
Legislation ID: 49633
Bill URL: View Bill
This bill prohibits social media platforms from promoting behaviors that could lead to eating disorders among users under 18 years old. It defines key terms related to social media and eating disorders and establishes requirements for platforms to audit their practices. Violations can result in substantial civil penalties.
| Date | Action |
|---|---|
| 2026-01-12 | Substituted by A4664 (1R) |
| 2026-01-08 | Reported from Senate Committee, 2nd Reading |
| 2026-01-06 | Transferred to Senate Budget and Appropriations Committee |
| 2025-02-25 | Introduced in the Senate, Referred to Senate Health, Human Services and Senior Citizens Committee |
Why Relevant: The bill mandates a systematic risk-assessment and mitigation regime for social media platforms regarding child safety.
Mechanism of Influence: It requires platforms to conduct 'quarterly audits' and 'annual audit[s]' of their 'practices, algorithms, designs, features, and affordances' to identify potential harms (Section 2(b)(1)(a)).
Evidence:
Ambiguity Notes: The term 'action to correct' is not defined, leaving it to the platform's discretion or future regulatory interpretation as to whether this requires changing recommendation engines or removing specific content types.
Why Relevant: The bill compels platforms to implement mitigation plans based on audit findings, mirroring the EU's mandated mitigation measures.
Mechanism of Influence: If an audit identifies a risk, the platform must 'take action to correct the practice, algorithm, design, feature, or affordance within 30 calendar days' (Section 2(b)(1)(b)).
Evidence:
Ambiguity Notes: The 30-day window is a strict compliance timeline similar to the EU's rapid response requirements for identified risks.
Why Relevant: The bill establishes a compliance infrastructure and auditing requirement to avoid significant civil penalties.
Mechanism of Influence: It creates a legal defense for platforms that maintain an 'internal audit program,' effectively making these assessments a mandatory part of platform governance (Section 2(b)(1)).
Evidence:
Ambiguity Notes: While the bill includes a Section 230 and First Amendment savings clause, the requirement to audit and 'correct' algorithms may create friction with existing federal protections for platform editorial discretion.
Legislation ID: 53999
Bill URL: View Bill
This bill amends the General Business Law to introduce regulations concerning the liability of chatbot proprietors for misleading or harmful information provided by their systems. It defines key terms related to artificial intelligence and chatbots, outlines the responsibilities of chatbot proprietors, and establishes requirements for the protection of users, particularly minors and those at risk of self-harm. The bill mandates clear disclosures to users regarding their interactions with chatbots and sets forth penalties for non-compliance.
| Date | Action |
|---|---|
| 2026-01-07 | referred to consumer affairs and protection |
| 2025-02-28 | amend and recommit to consumer affairs and protection |
| 2025-02-28 | print number 222a |
| 2025-01-08 | referred to consumer affairs and protection |
Why Relevant: The bill mandates age verification for users of companion chatbots.
Mechanism of Influence: Proprietors must use 'commercially reasonable and technically feasible methods' to determine if a user is a minor and obtain parental consent before allowing access.
Evidence:
Ambiguity Notes: The bill leaves the specific 'methods' and 'appropriate levels of accuracy' to be defined by Attorney General regulations, similar to the EU's reliance on 'reliable technologies.'
Why Relevant: The bill compels the detection of specific user content (self-harm) within interpersonal-style communications.
Mechanism of Influence: Proprietors are legally required to 'determine whether a covered user is expressing thoughts of self-harm,' necessitating the scanning of user inputs to identify prohibited indicators.
Evidence:
Ambiguity Notes: While focused on self-harm rather than CSAM, the mechanism of mandated content detection in 'interpersonal' simulations mirrors the EU Chat Act's scanning requirements.
Why Relevant: The bill requires ongoing risk and vulnerability assessments.
Mechanism of Influence: Proprietors must implement methods to discover system vulnerabilities, including those related to age determination and content detection, creating a continuous compliance and mitigation loop.
Evidence:
Ambiguity Notes: The term 'vulnerabilities' is broad and could encompass both security flaws and failures in the mandated moderation/detection systems.
Why Relevant: The bill mandates access-blocking/removal orders based on detected content.
Mechanism of Influence: If self-harm thoughts are detected, the proprietor is statutorily required to prohibit the user's continued use of the service for a set period (24 hours to 3 days).
Evidence:
Ambiguity Notes: This functions as a mandatory temporary 'blocking order' triggered by automated or human-reviewed detection.
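The detect-then-block pipeline looks like this in outline. The detector is assumed (the bill mandates the determination, not the method), and the suspension window is the statutory 24 hours to 3 days:

```python
import time

# Statutory lockout window: 24 hours minimum, 3 days maximum.
LOCKOUT_MIN_S, LOCKOUT_MAX_S = 24 * 3600, 3 * 24 * 3600

locked_until: dict[str, float] = {}

def detects_self_harm(message: str) -> bool:
    # Assumed detector: keyword rules, a classifier, or human review
    # would all satisfy the statutory duty as written.
    return "self-harm-indicator" in message  # placeholder rule

def handle_message(user_id: str, message: str, lockout_s: int = LOCKOUT_MIN_S) -> bool:
    """Return True if the user may continue using the service."""
    assert LOCKOUT_MIN_S <= lockout_s <= LOCKOUT_MAX_S
    if time.time() < locked_until.get(user_id, 0.0):
        return False  # still inside a mandated suspension
    if detects_self_harm(message):
        locked_until[user_id] = time.time() + lockout_s
        return False
    return True
```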
Legislation ID: 57911
Bill URL: View Bill
This legislation, known as the New York artificial intelligence bill of rights, is designed to protect New York residents from the potential harms of automated decision-making systems. It outlines specific rights related to safety, discrimination, data privacy, and the ability to opt for human alternatives in interactions with automated systems. The bill emphasizes the importance of oversight and accountability in the development and deployment of such technologies.
| Date | Action |
|---|---|
| 2026-01-07 | referred to science and technology |
| 2025-01-27 | referred to science and technology |
Why Relevant: The bill mandates risk assessments and mitigation plans for automated systems, a core structural feature of the EU Chat Act.
Mechanism of Influence: Developers must perform 'pre-deployment testing, risk identification and mitigation' and 'ongoing monitoring' to ensure systems are safe and effective (§ 504(2)).
Evidence:
Ambiguity Notes: While the mechanism is similar, the scope is limited to 'civil rights, civil liberties, and privacy' and 'access to critical resources' (§ 502) rather than content-specific scanning.
Legislation ID: 58033
Bill URL: View Bill
This bill amends the general business law to create standards for internet dating services regarding the verification of users identities, locations, and ages. It mandates that services verify members identities and locations before allowing them to register, particularly focusing on preventing minors from accessing such services. The bill also outlines penalties for non-compliance and the responsibilities of internet dating services to implement security measures.
| Date | Action |
|---|---|
| 2026-01-07 | referred to consumer affairs and protection |
| 2025-01-27 | referred to consumer affairs and protection |
Why Relevant: The bill mandates strict age verification to exclude minors from the service, a core element of the EU CSA Act's safety framework.
Mechanism of Influence: It creates a statutory duty to use technology to verify government IDs ('license verification') to identify and block minors, effectively gatekeeping the service based on age.
Evidence:
Ambiguity Notes: The term 'license verification' is defined broadly as 'the use of technology,' which could encompass various automated third-party ID verification systems with varying privacy implications.
Why Relevant: The bill requires identity verification that links digital accounts to physical, government-verified identities.
Mechanism of Influence: By requiring 'on-demand self-photographs' matched to government IDs, the bill establishes a 'know your user' compliance infrastructure that mirrors the identity-linked safety requirements often bundled with online safety regulations.
Evidence:
Ambiguity Notes: The requirement for 'on-demand self-photographs' suggests a biometric or liveness-check component, though the bill does not explicitly define the technical standards for this verification.
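The 'on-demand self-photograph' requirement implies a biometric pipeline: compare a live capture against the portrait on a government ID and accept above some similarity threshold. A sketch with the matcher stubbed out, since the bill sets no technical standard:

```python
from dataclasses import dataclass

@dataclass
class VerificationAttempt:
    id_portrait: bytes    # extracted from the government ID
    live_capture: bytes   # the 'on-demand self-photograph'

def face_similarity(a: bytes, b: bytes) -> float:
    """Stub for a biometric matcher (0.0 = no match, 1.0 = identical).

    Real systems use face-embedding models; the bill neither names a
    technology nor sets an accuracy floor, so the threshold below is
    an invented operating point."""
    return 1.0 if a == b else 0.0  # placeholder comparison

def identity_verified(attempt: VerificationAttempt, threshold: float = 0.9) -> bool:
    # Every verification event links a biometric sample to a legal
    # identity -- data the dating service must then protect or discard.
    return face_similarity(attempt.id_portrait, attempt.live_capture) >= threshold
```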
Why Relevant: The bill includes enforcement mechanisms and penalties for failing to verify users, creating a high-stakes compliance environment.
Mechanism of Influence: The Attorney General is empowered to seek damages per unverified member, compelling platforms to adopt verification technologies to avoid significant financial liability.
Evidence:
Ambiguity Notes: The 'pattern and practice' clause allows for tripled damages, which may lead platforms to over-verify or exclude users to minimize legal risk.
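A sketch of the registration gate that the 'license verification' and 'on-demand self-photograph' duties together imply, assuming a hypothetical face-similarity score and an 18+ requirement; the bill sets no accuracy standard, so the 0.9 cutoff and every name below are assumptions.

```python
from dataclasses import dataclass

# Hypothetical threshold; the bill sets no accuracy standard for the match.
FACE_MATCH_THRESHOLD = 0.9

@dataclass
class IDRecord:
    name: str
    age: int
    portrait: bytes  # photo extracted from the government ID

def face_similarity(selfie: bytes, portrait: bytes) -> float:
    """Placeholder: a real deployment would call a face-embedding model
    or a third-party verification vendor."""
    return 1.0 if selfie == portrait else 0.0

def may_register(record: IDRecord, on_demand_selfie: bytes) -> bool:
    """Gate registration on age and an ID-to-selfie match, the two checks
    the bill's verification duties imply."""
    if record.age < 18:
        return False  # minors must be excluded from the service
    return face_similarity(on_demand_selfie, record.portrait) >= FACE_MATCH_THRESHOLD
```

Note that every rejected or tripled-damages scenario pushes the threshold higher, which is exactly the over-verification incentive flagged above.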
Legislation ID: 58060
Bill URL: View Bill
This legislation seeks to amend the general business law by establishing regulations for interactive computer service providers in New York. It defines key terms related to content promotion and outlines the responsibilities and liabilities of these providers when targeting minors with potentially harmful content. The bill emphasizes the need for service providers to avoid knowingly or negligently promoting content that could cause physical or emotional harm to minors.
| Date | Action |
|---|---|
| 2026-01-07 | referred to consumer affairs and protection |
| 2025-01-27 | referred to consumer affairs and protection |
Why Relevant: The bill creates a de facto age verification mandate by removing legal defenses for misidentifying a user's age or data.
Mechanism of Influence: Section 3 creates a strict liability environment where providers cannot claim they didn't know a user was a minor. This compels the implementation of 'reliable' age verification technologies to avoid the $100,000 civil penalty per offense.
Evidence:
Ambiguity Notes: The bill does not specify which 'technologies' are acceptable for data determination, leaving providers to choose potentially intrusive methods to ensure 100% accuracy.
Why Relevant: The bill mandates a form of risk mitigation regarding algorithmic 'promotion' and 'targeting' of content.
Mechanism of Influence: By holding providers liable for 'negligently' promoting injurious content, the law forces providers to modify their recommender systems and moderation features to prevent 'dangerous' content from reaching minors, similar to the EU's mitigation requirements.
Evidence:
Ambiguity Notes: The term 'negligently' implies a duty of care that is not fully defined, likely requiring providers to perform internal risk assessments to prove they were not negligent.
Why Relevant: The bill's definition of 'targeting' involves extensive processing of sensitive personal data, impacting user privacy.
Mechanism of Influence: To comply with the prohibition on targeting minors based on 'psychological profile' or 'medical condition,' providers must effectively scan or analyze user data to categorize them, mirroring the data-heavy compliance infrastructure of the EU Chat Act.
Evidence:
Ambiguity Notes: The broad list of personal data (including 'engaging in an affair' or 'mental state') suggests that providers must have deep insight into private user characteristics to ensure they are not 'targeting' based on these factors.
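To make the compliance problem concrete, here is a minimal sketch of an internal audit check that rejects targeting requests aimed at minors on prohibited bases. The category names are paraphrased from the bill's examples, and the function is hypothetical.

```python
# Categories paraphrased from the bill's list of prohibited targeting bases.
PROHIBITED_BASES = {
    "psychological_profile",
    "medical_condition",
    "mental_state",
    "engaging_in_an_affair",
}

def audit_targeting_request(recipient_is_minor: bool, bases_used: set[str]) -> list[str]:
    """Return the prohibited bases a targeting request relies on; an empty
    list means this (simplified) check passes. Note what the sketch makes
    visible: running the check at all presumes the provider has already
    classified users along these sensitive dimensions."""
    if not recipient_is_minor:
        return []
    return sorted(PROHIBITED_BASES & bases_used)
```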
Legislation ID: 58101
Bill URL: View Bill
This bill establishes a framework for the oversight of high-risk advanced artificial intelligence systems by empowering a secretary to review, recommend, and enforce compliance measures. It outlines the responsibilities of operators regarding system modifications, incident reporting, and compliance with ethical standards, as well as the penalties for non-compliance. The bill also addresses issues related to source code management, third-party integrations, and security risks associated with AI systems.
| Date | Action |
|---|---|
| 2026-01-07 | referred to science and technology |
| 2025-01-27 | referred to science and technology |
Why Relevant: The bill mandates ongoing risk assessments and binding mitigation plans for high-risk systems.
Mechanism of Influence: The Secretary reviews systems for potential risk and issues binding recommendations that operators must implement via a detailed plan.
Evidence:
Ambiguity Notes: While similar in structure to the EU Chat Act's risk mitigation, the focus here is on AI safety and ethical conduct rather than child sexual abuse prevention.
Why Relevant: The bill requires mandatory reporting of system incidents to the state and law enforcement.
Mechanism of Influence: Licensees must report malfunctions that harm individuals, creating a centralized reporting pipeline for system failures.
Evidence:
Ambiguity Notes: This reporting is triggered by 'malfunctions' and 'harm' generally, not specifically by the detection of illegal content like CSAM.
Why Relevant: The bill imposes significant data preservation and logging requirements.
Mechanism of Influence: Every operation must generate a log, which must be preserved for ten years and made available for state inspection.
Evidence:
Ambiguity Notes: The scope of 'every operation' is broad and could include metadata or transaction details, though it is framed as system auditing rather than content monitoring.
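A sketch of what per-operation logging might look like, assuming a hash-chained append-only file; the chaining is an engineering assumption for audit integrity, since the bill mandates only that logs exist, be preserved for ten years, and be available for state inspection.

```python
import hashlib
import json
import time

def append_operation_log(path: str, operation: str, detail: dict) -> str:
    """Append one record per operation to a hash-chained log file. Chaining
    each entry to the previous entry's hash is an engineering assumption;
    the bill itself specifies only existence, ten-year retention, and
    availability for inspection."""
    prev_hash = "0" * 64
    try:
        with open(path, "rb") as f:
            lines = f.read().splitlines()
        if lines:
            prev_hash = json.loads(lines[-1])["hash"]
    except FileNotFoundError:
        pass
    record = {"ts": time.time(), "op": operation, "detail": detail, "prev": prev_hash}
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")
    return record["hash"]
```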
Why Relevant: The bill provides for the compelled cessation of services for prohibited systems.
Mechanism of Influence: The Secretary can demand that an operator cease development or operation of an AI system deemed 'prohibited' or high-risk to national security.
Evidence:
Ambiguity Notes: This functions as a 'kill switch' or blocking order for the software itself rather than specific user-generated content.
Legislation ID: 59276
Bill URL: View Bill
This legislation introduces new regulations under the general business law, specifically requiring commercial entities that publish or distribute sexual material harmful to minors to implement reasonable age verification methods. It outlines definitions, procedures for verification, and penalties for noncompliance, while providing exceptions for bona fide news organizations and internet service providers.
| Date | Action |
|---|---|
| 2026-01-07 | referred to consumer affairs and protection |
| 2025-05-19 | held for consideration in consumer affairs and protection |
| 2025-01-30 | referred to consumer affairs and protection |
Why Relevant: The bill establishes a mandatory age-verification regime for access to specific categories of online content.
Mechanism of Influence: It compels commercial entities to 'use reasonable age verification methods' (§ 390-f(2)(a)) and 'perform age verification for each internet protocol address... at least once per day' (§ 390-f(2)(b)).
Evidence:
Ambiguity Notes: The term 'reasonable age verification method' includes 'commercially reasonable method that relies on public or private transactional data' (§ 390-f(3)(a)(iii)(2)), which is not strictly defined and could encompass various third-party data-scraping or identity-matching services.
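The once-per-day, per-IP requirement in § 390-f(2)(b) amounts to a verification cache keyed on IP addresses. A minimal sketch, with all names assumed:

```python
from datetime import datetime, timedelta, timezone

VERIFICATION_TTL = timedelta(days=1)  # § 390-f(2)(b): at least once per day

_last_verified: dict[str, datetime] = {}  # IP address -> last successful check

def needs_verification(ip: str) -> bool:
    last = _last_verified.get(ip)
    return last is None or datetime.now(timezone.utc) - last >= VERIFICATION_TTL

def record_verification(ip: str) -> None:
    _last_verified[ip] = datetime.now(timezone.utc)
```

Keying state to IP addresses, as the statute does, collapses every user behind a shared or carrier-grade NAT into a single verification status, and turns the cache itself into a log of which addresses sought age-gated material.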
Why Relevant: The bill specifies the types of identity and financial data that must be processed to gate access, mirroring the EU's focus on reliable identification.
Mechanism of Influence: It lists acceptable verification methods including 'digital identification,' 'credit card transaction,' or 'government-issued identification' (§ 390-f(3)(a)).
Evidence:
Ambiguity Notes: The requirement for 'digital identification' (§ 390-f(1)(b)) as 'information stored on a digital network' that serves as 'proof of the identity' could imply a move toward centralized digital ID systems.
Why Relevant: The bill excludes infrastructure providers, focusing the compliance burden on content publishers.
Mechanism of Influence: It explicitly protects ISPs, search engines, and cloud providers from liability if they are not responsible for the 'creation of such content' (§ 390-f(4)(b)).
Evidence:
Ambiguity Notes: While it protects ISPs, the definition of 'commercial entity' is broad enough to cover any business entity that 'distributes' material, potentially creating friction for platforms that host user-generated content if they are deemed to 'distribute' it.
Legislation ID: 62966
Bill URL: View Bill
This legislation introduces comprehensive measures to safeguard consumer data rights, promoting transparency, individual control over personal information, and the establishment of privacy policies that align with federal regulations. It mandates that covered entities implement reasonable practices to mitigate privacy risks, especially concerning minors, and provides individuals with rights to access and manage their data.
| Date | Action |
|---|---|
| 2026-01-07 | referred to science and technology |
| 2025-02-20 | referred to science and technology |
Why Relevant: The bill mandates that covered entities perform risk assessments and implement mitigation plans specifically for minors.
Mechanism of Influence: Entities must 'identify, assess, and mitigate privacy risks related to covered minors' and ensure that residual risks are 'proportionate.'
Evidence:
Ambiguity Notes: The term 'privacy risks' is broad and, while primarily focused on data protection, could be interpreted by regulators to include risks related to the exposure of minors to harmful content or solicitation.
Why Relevant: The bill requires large data holders to conduct ongoing impact assessments for algorithms that pose a risk of harm to individuals, including minors.
Mechanism of Influence: Large data holders must submit annual impact assessments to the state division, detailing design methodologies and mitigation steps for potential harms to 'covered minors.'
Evidence:
Ambiguity Notes: The 'consequential risk of harm' trigger for these assessments is not strictly defined, potentially allowing for oversight of recommender systems or moderation algorithms.
Why Relevant: The bill establishes a specific office to oversee youth-related data practices and requires the designation of points of contact for privacy inquiries.
Mechanism of Influence: Creates a 'Youth Privacy and Marketing Office' and mandates that privacy policies include 'points of contact' for inquiries.
Evidence:
Ambiguity Notes: This creates the administrative infrastructure for compliance and oversight, though its mandate is focused on privacy rather than content detection.
Legislation ID: 64455
Bill URL: View Bill
This bill amends the general business law to introduce the New York Children's Online Safety Act, which includes provisions for privacy by default, parental approvals, and protections against manipulative design practices (dark patterns) on online platforms used by minors. It mandates age verification for users, default privacy settings for minors, and parental oversight for connections and financial transactions involving minors. The bill also outlines the authority of the attorney general to enforce these regulations and provides remedies for violations.
| Date | Action |
|---|---|
| 2026-01-12 | amend (t) and recommit to codes |
| 2026-01-12 | print number 6549a |
| 2026-01-07 | referred to codes |
| 2025-05-27 | reported referred to codes |
| 2025-03-06 | referred to consumer affairs and protection |
Why Relevant: The bill mandates age verification for all users to identify minors.
Mechanism of Influence: Section 1510(1) states that no operator shall offer a platform 'without conducting commercially reasonable age verification to determine whether a user is a covered minor.' This aligns with the EU Chat Act's emphasis on age-gating and identifying child users.
Evidence:
Ambiguity Notes: The term 'commercially reasonable' is not defined in the text, leaving the specific technology (e.g., ID upload, facial analysis) to be determined by Attorney General rulemaking.
Why Relevant: The bill targets platforms where private messaging is a core feature.
Mechanism of Influence: The definition of a 'covered platform' in Section 1509(6)(d) specifically includes services that allow users to 'privately message each other as a significant part' of the service. This brings interpersonal messaging services—a primary target of the EU Chat Act—under the scope of the law.
Evidence:
Ambiguity Notes: What constitutes a 'significant part' of a service is subjective and could lead to broad application across various social and communication apps.
Why Relevant: The bill imposes 'Privacy by Default' requirements that restrict private communication.
Mechanism of Influence: Section 1510(2)(a) requires that, by default, no user who is not 'connected' to a minor may 'communicate directly and privately with such minor.' This creates a statutory barrier to private messaging that platforms must enforce through technical means.
Evidence:
Ambiguity Notes: While the bill does not explicitly mandate scanning content, enforcing 'connections' and blocking 'unconnected' messages may require platforms to identify the status of all participants in a private chat, potentially impacting the anonymity of encrypted services.
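A minimal sketch of the default-deny gate Section 1510(2)(a) implies, with hypothetical in-memory stores standing in for the platform's user and connection data:

```python
connections: dict[str, set[str]] = {}  # user_id -> approved 'connected' accounts
minor_status: dict[str, bool] = {}     # resolved via the bill's age verification

def may_deliver_dm(sender: str, recipient: str) -> bool:
    """Default-deny sketch of Section 1510(2)(a): a private message reaches
    a covered minor only from a connected account. Note the cost: the
    platform must resolve the age status and connection graph of both
    parties on every message, which is hard to square with anonymity."""
    if not minor_status.get(recipient, False):
        return True
    return sender in connections.get(recipient, set())
```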
Legislation ID: 166829
Bill URL: View Bill
The New York Artificial Intelligence Act aims to regulate AI systems that significantly impact individuals' rights and opportunities. It addresses algorithmic discrimination, mandates developer and deployer responsibilities, and introduces auditing and reporting requirements for high-risk AI systems. The Act emphasizes the need for transparency, oversight, and the protection of vulnerable populations from potential harms associated with AI technologies.
| Date | Action |
|---|---|
| 2026-01-12 | reference changed to science and technology |
| 2026-01-07 | referred to ways and means |
| 2025-06-11 | reference changed to ways and means |
| 2025-06-09 | referred to science and technology |
Why Relevant: Provider risk assessments & mitigation plans
Mechanism of Influence: Mandates that developers and deployers implement a 'risk management policy and program' to identify and mitigate foreseeable risks.
Evidence:
Ambiguity Notes: The definition of 'high-risk' is broad, covering any system that is a 'substantial factor' in a consequential decision or has a 'material impact' on rights.
Why Relevant: Reporting and compliance infrastructure
Mechanism of Influence: Requires annual or biennial third-party audits and the filing of detailed impact assessment reports with the Attorney General.
Evidence:
Ambiguity Notes: Reporting focuses on training data and system architecture rather than user content or metadata.
Why Relevant: Internal controls and redress
Mechanism of Influence: Establishes a mandatory appeal process for automated decisions, requiring human intervention and review.
Evidence:
Ambiguity Notes: None
Legislation ID: 167224
Bill URL: View Bill
This bill amends the general business law by introducing a new article that requires covered manufacturers of internet-enabled devices to implement age assurance measures. It defines key terms related to age assurance, establishes requirements for determining whether a user is a minor, and outlines the nondiscrimination obligations of manufacturers. Additionally, it grants the attorney general rulemaking authority and sets forth enforcement mechanisms.
| Date | Action |
|---|---|
| 2026-01-07 | referred to consumer affairs and protection |
| 2025-06-10 | referred to consumer affairs and protection |
Why Relevant: The bill mandates systemic age verification/assessment at the hardware and OS level.
Mechanism of Influence: It shifts the burden of age identification from individual service providers to the device and OS infrastructure, requiring 'commercially reasonable' and 'technically feasible' methods to 'reliably identify child users.'
Evidence:
Ambiguity Notes: The bill leaves the specific 'methods' and 'appropriate levels of accuracy' to the Attorney General's rulemaking, which could potentially include biometric or identity-document-based verification.
Why Relevant: The bill imposes specific compliance and gatekeeping obligations on application stores and OS providers.
Mechanism of Influence: App stores and OS providers are categorized as 'covered manufacturers' and must integrate age-signaling features into their updates 'by default,' creating a centralized compliance infrastructure.
Evidence:
Ambiguity Notes: By requiring the signal to be provided to 'all websites, online services, online applications,' it creates a universal tracking signal for minor status across all software.
Why Relevant: The bill mandates a technical 'signal' or API for age status, which functions as a form of centralized reporting/routing of user status.
Mechanism of Influence: The requirement for a 'real-time application programming interface (API)' to signal minor status to third parties replicates the EU's focus on technical interoperability for age-gating.
Evidence:
Ambiguity Notes: While the bill mandates data deletion after the initial check, the persistent 'digital signal' effectively labels the user across all internet activity on that device.
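A sketch of the kind of 'signal' payload such an API might return. The schema is entirely hypothetical (the bill mandates a real-time API but no format), and the stable identifier included here illustrates why a device-level signal can double as a cross-site tracking vector:

```python
import json

def age_signal(device_user_id: str, minor_flags: dict[str, bool]) -> str:
    """Return the JSON 'signal' an OS-level API might expose, by default,
    to every website, service, and application. Hypothetical schema; the
    bill specifies the API's existence, not its payload."""
    return json.dumps({
        "user": device_user_id,  # any stable identifier doubles as a cross-site tracker
        "is_minor": minor_flags.get(device_user_id, False),
    })
```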
Legislation ID: 242277
Bill URL: View Bill
This bill amends the General Business Law by adding a new article focused on protecting minors online. It defines key terms, outlines the responsibilities of social media platforms regarding minor account holders, establishes requirements for age verification, and sets penalties for non-compliance. The legislation seeks to ensure that minors are not exposed to harmful content and that their personal information is adequately protected.
| Date | Action |
|---|---|
| 2026-01-07 | referred to consumer affairs and protection |
| 2025-12-19 | referred to consumer affairs and protection |
Why Relevant: The bill mandates age verification for social media access and harmful content.
Mechanism of Influence: Entities must use 'anonymous age verification' or 'standard age verification' to gate access to platforms and content.
Evidence:
Ambiguity Notes: 'Standard age verification' is broadly defined as any 'commercially reasonable method,' leaving implementation details to the platforms.
Legislation ID: 252533
Bill URL: View Bill
This legislation amends existing laws to mandate warning labels on social media platforms identified as having addictive features. It reflects findings from health authorities regarding the detrimental effects of excessive social media use on mental health, particularly among adolescents. The bill seeks to inform users about potential mental health risks associated with prolonged use of these platforms.
| Date | Action |
|---|---|
| 2026-01-12 | reported referred to codes |
| 2026-01-07 | referred to consumer affairs and protection |
| 2026-01-06 | referred to consumer affairs and protection |
Why Relevant: The bill contains an age-verification or assessment component.
Mechanism of Influence: Section 1525(5) provides an exemption for operators who have 'reasonably determined' a user is over seventeen, which functionally necessitates the implementation of age-verification or assessment technologies to distinguish covered users from adults.
Evidence:
Ambiguity Notes: The term 'reasonably determined' is not defined, leaving the specific technology or method of age verification to the discretion of the operator or future rulemaking by the attorney general.
Why Relevant: The bill targets specific platform design features and recommender systems for regulatory intervention.
Mechanism of Influence: Like the EU Chat Act's focus on recommender systems as risk factors, this bill identifies 'addictive feeds' and 'autoplay' as triggers for mandatory 'mitigation' in the form of warning labels.
Evidence:
Ambiguity Notes: While it targets the same features, the 'mitigation' is limited to a disclosure (warning label) rather than a requirement to alter the underlying algorithm or conduct a formal risk assessment of content.
Legislation ID: 162997
Bill URL: View Bill
House Bill 805 establishes definitions for biological sex, limits state funding for gender transition procedures, and aims to prevent sexual exploitation through strict regulations on online pornography. The bill mandates that all state laws reflect a binary understanding of sex and introduces measures to protect minors and women from exploitation, including consent requirements for publishing pornographic images. Additionally, it allows students to be excused from discussions that conflict with their religious beliefs and ensures parental oversight of educational materials.
| Date | Action |
|---|---|
| 2025-07-29 | Placed on Todays Calendar |
| 2025-07-29 | Veto Overridden |
| 2025-07-29 | Veto Received From House |
| 2025-07-03 | Placed On Cal For 07/29/2025 |
| 2025-07-03 | Received from the Governor |
| 2025-07-03 | Vetoed 07/03/2025 |
| 2025-06-27 | Pres. To Gov. 6/27/2025 |
| 2025-06-26 | Ratified |
Why Relevant: The bill mandates a form of compelled detection and filtering to prevent the re-dissemination of prohibited content.
Mechanism of Influence: By requiring operators to block 'any altered or edited version' of a removed image, the law functionally compels the use of hashing or perceptual scanning technologies to identify and prevent re-uploads that are not exact matches.
Evidence:
Ambiguity Notes: The term 'altered or edited version' is not technically defined, leaving platforms to determine the threshold of similarity required for blocking, which typically necessitates 'state of the art' detection tools.
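A sketch of the perceptual near-duplicate detection this duty points toward, using a simple 64-bit average hash; production systems use more robust perceptual hashes, and the Hamming-distance threshold is an assumption, since the statute never defines how 'altered' an image may be before the blocking duty lapses.

```python
def average_hash(pixels: list[int]) -> int:
    """64-bit average hash of an 8x8 grayscale thumbnail (64 values, 0-255).
    Real systems use more robust perceptual hashes, but the near-duplicate
    principle is the same."""
    assert len(pixels) == 64
    mean = sum(pixels) / 64
    bits = 0
    for value in pixels:
        bits = (bits << 1) | (1 if value >= mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

def is_altered_version(candidate: int, blocked: set[int], max_distance: int = 10) -> bool:
    """Flag near-duplicates of removed images; the distance threshold is
    an assumption the statute leaves to the platform."""
    return any(hamming_distance(candidate, h) <= max_distance for h in blocked)
```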
Why Relevant: The bill imposes strict removal and blocking orders with short compliance windows.
Mechanism of Influence: Operators must remove content within 72 hours of a request from an 'eligible person' or law enforcement, mirroring the rapid takedown requirements seen in the EU CSA Act.
Evidence:
Ambiguity Notes: The 72-hour window is a hard limit, which may pressure platforms to automate removals to avoid the $10,000 per day penalties.
Why Relevant: The bill mandates age verification and identity documentation for content uploaded to platforms.
Mechanism of Influence: Operators must verify that every individual in a 'pornographic image' is at least 18 years old by obtaining valid government identification and written consent from the user or entity seeking to publish.
Evidence:
Ambiguity Notes: While focused on the age of performers, the burden of 'verifying' these details falls on the 'online entity operator' for any user-submitted content.
Why Relevant: The bill requires the creation of internal compliance infrastructure and designated points of contact.
Mechanism of Influence: Platforms must establish formal procedures for content removal and designate specific employees to manage these requests, creating a mandatory administrative layer for content moderation.
Evidence:
Ambiguity Notes: None
This bill establishes the Social Media Control in Information Technology Act, which mandates that social media platforms provide clear disclosures regarding data collection and usage, particularly for minors. It requires platforms to implement user-friendly mechanisms for privacy rights, prohibits the use of minors' data in algorithmic recommendations, and sets default privacy settings to protect young users. Additionally, it holds operators accountable for non-compliance and creates a registry for privacy policies.
| Date | Action |
|---|---|
| 2025-06-17 | Reptd Fav Com Substitute |
| 2025-06-17 | Re-ref Com On Appropriations |
| 2025-04-10 | Passed 1st Reading |
| 2025-04-10 | Ref to the Com on Commerce and Economic Development, if favorable, Appropriations, if favorable, Rules, Calendar, and Operations of the House |
| 2025-04-09 | Filed |
Why Relevant: The bill requires platforms to determine the age of users to restrict algorithmic targeting.
Mechanism of Influence: Operators must 'reasonably determine' if a user is a minor before using personal information in recommendations, though it explicitly allows for 'self-attestation' as a defense against liability.
Evidence:
Ambiguity Notes: While it requires age determination, the standard is lower than the 'reliable' verification often seen in EU-style mandates, as it protects operators from liability if a minor falsely attests to their age.
Why Relevant: The bill mandates specific design changes and 'controls' to mitigate risks associated with social media use for minors.
Mechanism of Influence: Platforms must implement 'comprehensive and effective controls' to prevent minors' data from entering recommendation systems and must disable features like autoplay and geolocation by default.
Evidence:
Ambiguity Notes: The 'controls' are focused on data privacy and 'addiction' (e.g., autoplay) rather than the detection or mitigation of illegal content or solicitation.
Why Relevant: The bill establishes a compliance and oversight body, though its focus is distinct from the EU's coordinating authorities.
Mechanism of Influence: It creates the North Carolina Data Privacy Task Force to report on mental health and social media issues and requires platforms to provide a registry of privacy policies.
Evidence:
Ambiguity Notes: The task force is advisory and focused on mental health and privacy, not on overseeing detection orders or CSAM reporting pipelines.
Legislation ID: 232842
Bill URL: View Bill
House Bill No. 628 establishes a regulatory framework for independent verification organizations in Ohio, specifically targeting the verification of artificial intelligence applications and models. The bill defines key terms, outlines the licensing process, and sets forth the responsibilities and requirements for both the verification organizations and the developers or deployers of AI technologies. It also includes provisions for the establishment of an advisory council to oversee the implementation and effectiveness of the verification process.
| Date | Action |
|---|---|
| 2025-12-11 | Introduced |
Why Relevant: The bill mandates a framework for ongoing risk assessment and mitigation plans for AI technologies, which mirrors the structural requirements of the EU CSA Act.
Mechanism of Influence: IVOs must evaluate 'technical and operational requirements' and 'ongoing monitoring of risks' for AI models to ensure 'acceptable levels of risk mitigation.'
Evidence:
Ambiguity Notes: While the structure is similar, the 'risks' defined in this bill pertain to 'personal injury and property damage' rather than child safety or content-related harms.
Why Relevant: The bill establishes a compliance infrastructure including an advisory council and reporting requirements to the Attorney General.
Mechanism of Influence: Creates an 'artificial intelligence safety advisory council' to oversee the licensing of IVOs and requires annual reports on AI capabilities and societal risks.
Evidence:
Ambiguity Notes: None
Legislation ID: 266871
Bill URL: View Bill
This bill establishes regulations for deployers of artificial intelligence (AI) chatbots, specifically those with human-like features. It mandates that such chatbots not be made available to minors, requires age verification systems, and allows for alternative versions of chatbots for younger users. Additionally, it outlines the responsibilities of deployers to prioritize user safety and well-being, as well as the penalties for non-compliance.
| Date | Action |
|---|---|
| 2026-02-02 | Authored by Representative Maynard |
| 2026-02-02 | First Reading |
Why Relevant: The bill mandates age verification and age certification systems for AI services.
Mechanism of Influence: Deployers are legally required to gate access to specific AI features using 'reasonable age certification systems' and 'age verification systems' to ensure minors cannot interact with prohibited chatbot types.
Evidence:
Ambiguity Notes: The term 'reasonable' is not defined, leaving open whether this requires high-assurance identity verification or simpler methods.
Why Relevant: The bill compels the detection and reporting of specific user-generated content/interactions.
Mechanism of Influence: Deployers must 'detect' and 'report' emergency situations (defined as intent to harm self or others). This functionally requires the operation of monitoring or scanning technologies within the chatbot interface to identify these indicators.
Evidence:
Ambiguity Notes: The requirement to 'detect' implies a proactive monitoring duty, though the bill does not specify the technology (e.g., keyword vs. semantic AI analysis).
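A sketch of the detect-and-report loop, assuming a crude keyword matcher; the bill's 'detect' duty does not choose between keyword and semantic analysis, and the record schema and escalation step below are hypothetical.

```python
from datetime import datetime, timezone

# Keyword-level matching is an assumption; the bill does not specify a method.
EMERGENCY_PATTERNS = ("hurt myself", "kill myself", "harm others")

def scan_chat_turn(user_id: str, text: str) -> dict | None:
    """Return an emergency-report record when a user's message matches,
    mirroring the detect / respond / report / mitigate sequence the bill
    prescribes."""
    hits = [p for p in EMERGENCY_PATTERNS if p in text.lower()]
    if not hits:
        return None
    return {
        "user": user_id,
        "matched": hits,
        "ts": datetime.now(timezone.utc).isoformat(),
        "action": "escalate_to_human_review",
    }
```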
Why Relevant: The bill mandates risk mitigation and safety prioritization over other business interests.
Mechanism of Influence: Similar to the EU Chat Act's mitigation plans, this bill requires deployers to 'mitigate' detected harms in a way that 'prioritizes the safety and well-being of users over the deployer's other interests.'
Evidence:
Ambiguity Notes: The prioritization of safety over 'other interests' could be interpreted as a mandate to override privacy or encryption if they interfere with detection/mitigation.
Why Relevant: The bill establishes a compliance infrastructure with civil penalties and private rights of action.
Mechanism of Influence: It creates a legal framework for enforcement via the Attorney General and allows for class-action lawsuits by parents/minors for non-compliance.
Evidence:
Ambiguity Notes: None
Legislation ID: 269132
Bill URL: View Bill
House Bill 4083 introduces regulations for AI chatbots in Oklahoma, focusing on preventing minors from accessing chatbots with human-like features. It mandates deployers to implement age verification systems, restricts access to social AI companions for minors, and outlines conditions under which therapeutic chatbots can be used by minors. The bill also establishes legal consequences for violations and emphasizes the need for safety measures in emergency situations.
| Date | Action |
|---|---|
| 2026-02-02 | Authored by Representative Alonso-Sandoval |
| 2026-02-02 | First Reading |
Why Relevant: The bill mandates age-verification systems to gate access to specific AI functionalities.
Mechanism of Influence: Deployers are required to 'implement reasonable age verification systems' to ensure minors do not access chatbots with human-like features or social AI companions.
Evidence:
Ambiguity Notes: The term 'reasonable age verification' is not defined, leaving open whether this requires government ID, biometric analysis, or third-party verification services.
Why Relevant: The bill compels the detection and reporting of specific content within user-chatbot interactions.
Mechanism of Influence: Deployers must 'detect, promptly respond to, report, and mitigate emergency situations,' which are defined as users indicating intent to harm themselves or others. This necessitates active monitoring of conversation content.
Evidence:
Ambiguity Notes: While focused on 'emergency situations' rather than CSAM, the statutory duty to 'detect' and 'report' creates a monitoring infrastructure similar to the EU's detection orders.
Why Relevant: The bill imposes data collection and minimization standards linked to the deployer's 'legitimate purpose.'
Mechanism of Influence: It restricts data collection to what is 'adequate,' 'relevant,' and 'necessary' for a legitimate purpose, while also requiring the storage of information needed to fulfill the detection/reporting mandate.
Evidence:
Ambiguity Notes: The tension between 'data minimization' and the mandate to 'detect' and 'report' emergency situations may force providers to retain more interaction metadata than otherwise necessary for privacy.
Legislation ID: 269135
Bill URL: View Bill
Senate Bill 1521 establishes regulations for artificial intelligence chatbots, defining key terms and prohibiting the creation of chatbots that could harm minors. It mandates user account creation, age verification processes, and data protection measures for covered entities. The bill also grants enforcement authority to the Attorney General, allowing for civil penalties for violations.
| Date | Action |
|---|---|
| 2026-02-02 | Authored by Senator Hamilton |
| 2026-02-02 | First Reading |
Why Relevant: The bill mandates a highly intrusive age-verification infrastructure similar to the EU Chat Act's requirements for identifying child users.
Mechanism of Influence: Section 3 requires covered entities to verify the age of every user by requiring the 'upload of a valid state-issued form of identification' (Section 1, 5) before allowing access to AI chatbot services.
Evidence:
Ambiguity Notes: The definition of 'reasonable age verification' is strictly tied to government ID, leaving no room for less intrusive technical assessments or privacy-preserving methods.
Why Relevant: The bill imposes a liability standard based on the 'risk' of solicitation, functionally compelling providers to implement mitigation and filtering features.
Mechanism of Influence: Section 2 makes it unlawful to provide a chatbot that 'poses a risk of soliciting, encouraging, or inducing minors' to engage in sexually explicit conduct. This creates a statutory duty for providers to assess and mitigate the output of their models to prevent such interactions.
Evidence:
Ambiguity Notes: The term 'poses a risk' is broad and may lead to proactive scanning or restrictive filtering of all AI-generated content to avoid 'reckless disregard' liability.
Why Relevant: The bill includes data preservation and internal compliance requirements related to the age-verification process.
Mechanism of Influence: Section 3(F) requires entities to maintain data security and 'retain age verification data' for as long as necessary to 'maintain compliance with this act,' creating a mandate for storing sensitive user identity metadata.
Evidence:
Ambiguity Notes: While it limits retention to what is 'reasonably necessary,' the requirement to maintain compliance with the act may lead to long-term storage of ID-linked data for audit purposes.
Legislation ID: 268177
Bill URL: View Bill
This legislation defines terms related to social media usage, particularly concerning minors. It allows minors or their guardians to sue social media companies for mental health issues stemming from excessive use of their platforms. The bill outlines the criteria for recovering damages and establishes a rebuttable presumption regarding the causation of mental health outcomes. It also sets forth requirements for social media companies to limit minor users' engagement and specifies the rights and protections that cannot be waived or limited.
| Date | Action |
|---|---|
| 2026-02-02 | Authored by Senator Jech |
| 2026-02-02 | First Reading |
Why Relevant: Mandated mitigation of product features and algorithms.
Mechanism of Influence: The bill incentivizes platforms to modify their 'curation algorithm' and 'engagement driven design elements' to avoid a rebuttable presumption of liability for mental health harms. This mirrors the EU Act's focus on mandated mitigation changes to recommender systems and product features.
Evidence:
Ambiguity Notes: While the EU Act focuses on mitigating CSAM risks, this bill focuses on 'excessive use' and mental health, yet uses the same mechanism of compelling product design changes.
Why Relevant: Implicit requirement for age verification and parental consent.
Mechanism of Influence: To qualify for the safe harbor protections, platforms must identify 'minor users' and obtain 'parental consent,' effectively requiring the implementation of age-verification or age-estimation infrastructure.
Evidence:
Ambiguity Notes: None
Legislation ID: 268190
Bill URL: View Bill
This bill introduces a framework for social media companies operating in Oklahoma to implement measures aimed at protecting minors. It defines key terms related to social media use and sets requirements for age verification, data segregation, and parental consent. The bill mandates that social media platforms take specific actions to safeguard minors' personal information and limit their access to services, while also providing mechanisms for parental supervision and control.
| Date | Action |
|---|---|
| 2026-02-02 | Authored by Senator Jech |
| 2026-02-02 | First Reading |
Why Relevant: The bill mandates a highly intrusive form of age verification, a core pillar of the EU proposal's infrastructure for identifying child users.
Mechanism of Influence: Social media companies are required to 'freeze all accounts' and only restore functionality after a user provides 'age data that is verifiable' via the 'upload of a valid state-issued form of identification.'
Evidence:
Ambiguity Notes: The definition of 'reasonable age verification process' is prescriptive, explicitly rejecting self-declaration or simple birthdate entry.
Why Relevant: The bill compels specific product-level 'mitigation' measures to protect minors, similar to the EU Act's risk-mitigation requirements.
Mechanism of Influence: Platforms must disable features that 'prolong engagement,' such as autoplay and infinite scroll, and restrict direct messaging capabilities to 'connected accounts' only.
Evidence:
Ambiguity Notes: None
Why Relevant: The bill establishes data segregation and internal control requirements for the personal information collected during the age-verification process.
Mechanism of Influence: Companies must 'segregate any personal information gathered specifically for reasonable age verification' and are prohibited from using it for other purposes, creating a specific compliance silo.
Evidence:
Ambiguity Notes: None
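A sketch of the mandated segregation as a dedicated vault that stores only the verification outcome and refuses reuse; retaining the boolean outcome rather than the ID image itself is a minimization assumption, not a statutory requirement.

```python
class AgeVerificationVault:
    """Segregated store for age-verification outcomes, kept apart from the
    product database and usable only for compliance, per the bill's
    segregation mandate. The in-memory dict stands in for an isolated
    datastore."""

    def __init__(self) -> None:
        self._records: dict[str, bool] = {}

    def store_outcome(self, user_id: str, verified_adult: bool) -> None:
        self._records[user_id] = verified_adult

    def is_verified_adult(self, user_id: str) -> bool:
        return self._records.get(user_id, False)

    def export_for_other_purposes(self, user_id: str) -> None:
        # The bill prohibits any reuse of age-verification data.
        raise PermissionError("age-verification data may not be repurposed")
```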
Legislation ID: 269137
Bill URL: View Bill
This bill creates the Oklahoma Children’s Internet Protection Act, which aims to safeguard minors from potentially harmful online interactions and agreements. It prohibits interactive computer service providers from entering into contracts with minors without the express consent of their parents or legal guardians, outlines methods for obtaining such consent, and establishes penalties for violations of these provisions. Additionally, the bill provides guidelines for age verification and specifies exceptions for news organizations and certain providers.
| Date | Action |
|---|---|
| 2026-02-02 | Authored by Senator Grellner |
| 2026-02-02 | First Reading |
Why Relevant: The bill mandates age verification and parental consent for interactive computer services, which is a core element of the regulatory framework for child safety online.
Mechanism of Influence: Providers must use 'reasonable age-verification methods' and obtain 'prior express consent' through specific methods like ID collection or videoconferencing.
Evidence:
Ambiguity Notes: The term 'reasonable age-verification methods' is not strictly defined, leaving the specific technology choice to the provider but subject to Attorney General oversight.
Why Relevant: The bill restricts interpersonal communication by prohibiting adults from communicating with minors on the service.
Mechanism of Influence: This creates a statutory duty for platforms to monitor or gate communications to ensure no adult-to-minor interaction occurs, which has significant implications for private messaging.
Evidence:
Ambiguity Notes: The bill does not specify how a provider should determine if a user is an adult or a minor for the purpose of communication without constant monitoring or universal age verification.
Why Relevant: The bill imposes liability for allowing minors to access 'harmful' content or exposing their personal data.
Mechanism of Influence: To avoid liability, providers may be compelled to implement scanning or filtering technologies to detect 'material harmful to minors' or to prevent the display of a minor's 'persona.'
Evidence:
Ambiguity Notes: The prohibition on making a minor's 'persona' accessible could be interpreted as requiring the removal of public profiles or the implementation of strict privacy defaults for all users suspected of being minors.
Legislation ID: 267073
Bill URL: View Bill
This legislation amends existing statutes related to obscenity and child sexual abuse material. It introduces new definitions, modifies penalties for offenses, and establishes civil remedies for individuals harmed by the production or distribution of unlawful pornography. The bill also mandates actions by internet service providers to combat such materials.
| Date | Action |
|---|---|
| 2026-02-02 | Authored by Senator Deevers |
| 2026-02-02 | First Reading |
Why Relevant: The bill mandates the implementation of filtering technology by service providers to prevent access to specific content categories.
Mechanism of Influence: Section 4 creates a statutory duty for ISPs to operate technology that detects and blocks access to prohibited content, effectively requiring a state-wide filtering infrastructure.
Evidence:
Ambiguity Notes: The term 'filtering technology' is undefined, leaving open whether this requires DNS-level blocking, deep packet inspection (DPI), or other intrusive network-level scanning methods.
Why Relevant: The bill includes broad definitions and civil enforcement mechanisms that incentivize proactive platform monitoring.
Mechanism of Influence: By allowing 'any person' to bring civil actions for statutory damages against those who 'aid or abet' the distribution of 'unlawful pornography,' the bill creates significant legal pressure for platforms and intermediaries to implement aggressive detection and removal systems.
Evidence:
Ambiguity Notes: The definition of 'unlawful pornography' includes broad categories of 'lewd exhibition' and 'sexual conduct' that may overlap with protected speech, potentially leading to over-blocking by providers seeking to avoid liability.
Legislation ID: 269139
Bill URL: View Bill
Senate Bill 2085 introduces comprehensive regulations regarding artificial intelligence technology in Oklahoma. It defines key terms, prohibits state entities from contracting with foreign adversaries, and establishes rights for individuals concerning AI use. The bill also includes specific provisions to protect minors from inappropriate interactions with AI chatbots and mandates transparency from AI companies regarding data use and user interactions.
| Date | Action |
|---|---|
| 2026-02-02 | Authored by Senator Hamilton |
| 2026-02-02 | First Reading |
Why Relevant: The bill mandates age-gating and parental consent for specific types of interactive platforms.
Mechanism of Influence: By requiring that a 'companion chatbot platform shall prohibit a minor from entering into a contract... unless the minor’s parent or legal guardian provides consent,' the bill effectively mandates the implementation of age verification or age assessment technologies to distinguish between adult and minor users.
Evidence:
Ambiguity Notes: The bill does not specify the 'reliable' method for verifying age or parental status, leaving platforms to determine the level of data collection required to comply.
Why Relevant: The bill requires platforms to implement content mitigation measures to prevent the exposure of minors to harmful material.
Mechanism of Influence: Platforms must 'institute reasonable measures' to prevent the AI from 'producing or sharing materials harmful to minors.' This creates a statutory duty for proactive content filtering and output moderation similar to the mitigation obligations in the EU proposal.
Evidence:
Ambiguity Notes: 'Reasonable measures' is not defined, which may lead platforms to implement aggressive automated scanning and filtering of AI outputs to avoid the $50,000 per-violation penalty.
Why Relevant: The bill compels the creation of a monitoring and access infrastructure for private interactions.
Mechanism of Influence: The platform is required to provide parents with 'copies of all interactions between the minor account holder and the companion chatbot.' This necessitates the retention and accessibility of dialogue logs that might otherwise be private or ephemeral.
Evidence:
Ambiguity Notes: While this applies to AI-to-human interaction rather than human-to-human messaging, it establishes a precedent for compelled access to private digital dialogues for the purpose of child safety.
Why Relevant: The bill includes data preservation and deletion requirements tied to account management.
Mechanism of Influence: Platforms must delete personal information upon account termination unless law requires otherwise, and must provide notifications for self-harm, mirroring the 'internal controls' and 'reporting' signals of the EU Act.
Evidence:
Ambiguity Notes: None
The App Store Accountability Act seeks to regulate app stores by mandating age verification for users, requiring parental consent for minors, and ensuring that content ratings are clear and accurate. It establishes guidelines for app store providers and developers to follow, with enforcement mechanisms and potential penalties for non-compliance, while emphasizing the importance of parental involvement in minors' app usage.
| Date | Action |
|---|---|
| 2026-01-13 | Member(s) request name added as sponsor: Oremus |
| 2025-04-23 | Member(s) request name added as sponsor: Martin |
| 2025-01-14 | Introduced and read first time ( House Journal-page 192 ) |
| 2025-01-14 | Referred to Committee on Judiciary ( House Journal-page 192 ) |
| 2024-12-05 | Prefiled |
| 2024-12-05 | Referred to Committee on Judiciary |
Why Relevant: The bill mandates strict age verification for all users, a core pillar of the EU proposal's access control framework.
Mechanism of Influence: Section 37-31-20(A) requires providers to 'determine the age category for every individual... and verify that user's age,' which necessitates the implementation of identity-linked or biometric age-assurance technologies.
Evidence:
Ambiguity Notes: The term 'commercially available methods' is undefined, leaving open whether this requires government ID upload, facial age estimation, or credit card verification.
Why Relevant: The bill imposes gatekeeping obligations on app stores to regulate third-party app access and content.
Mechanism of Influence: Section 37-31-20(B) compels app stores to act as the primary enforcement point for parental consent and to provide 'mechanisms for parents to block the download of any apps... unsuitable for a particular minor's age category.'
Evidence:
Ambiguity Notes: The requirement for 'download-by-download' consent could significantly disrupt app store functionality and user experience for minors.
Why Relevant: It establishes a technical compliance infrastructure for sharing age data between platforms and developers.
Mechanism of Influence: Section 37-31-20(E) requires app stores to create a 'real-time application programming interface' (API) to transmit age 'signals' to developers, creating a persistent tracking mechanism for a user's age status across different apps.
Evidence:
Ambiguity Notes: The 'signal' (defined in Section 37-31-10(11)) creates a standardized data flow that could be used for broader profiling beyond simple age gating.
Legislation ID: 196757
Bill URL: View Bill
Bill 3431 aims to amend the South Carolina Code of Laws by introducing new regulations for social media companies that cater to minors. It establishes definitions, outlines requirements for protecting minors' personal data, restricts access during certain hours, and mandates parental controls. The bill also addresses consumer complaints and provides for enforcement mechanisms, ensuring that social media platforms prioritize the safety and well-being of minor users.
| Date | Action |
|---|---|
| 2026-01-14 | Returned to Senate with amendments |
| 2026-01-14 | Roll call Yeas-112 Nays-0 |
| 2026-01-14 | Senate amendment amended |
| 2025-05-12 | Scriveners error corrected |
| 2025-05-06 | Amended ( Senate Journal-page 36 ) |
| 2025-05-06 | Read third time and returned to House with amendments ( Senate Journal-page 36 ) |
| 2025-05-06 | Roll call Ayes-45 Nays-0 ( Senate Journal-page 36 ) |
| 2025-05-05 | Scriveners error corrected |
Why Relevant: The bill mandates provider risk mitigation through a statutory duty of care and independent audits.
Mechanism of Influence: Section 39-80-20(A) requires services to 'exercise reasonable care' to prevent harms such as 'compulsive usage' and 'severe psychological harm.' This is reinforced by Section 39-80-70, which requires an annual 'public report prepared by an independent third-party auditor' evaluating 'covered design features' and 'algorithms.'
Evidence:
Ambiguity Notes: The term 'reasonable care' is broad and may be interpreted by regulators to require proactive monitoring or modification of core service features to mitigate vaguely defined psychological harms.
Why Relevant: The bill incorporates age verification and assessment frameworks.
Mechanism of Influence: While not mandating a specific technology, the bill regulates the data used for 'age verification or estimation' and requires services to report their methods to the Attorney General.
Evidence:
Ambiguity Notes: By requiring services to report 'age verification or estimation methods used,' the bill effectively compels the adoption of such systems to comply with the 'duty of care' for known minors.
Why Relevant: The bill requires centralized reporting mechanisms for harms to minors.
Mechanism of Influence: Services must establish systems for parents, minors, and schools to report 'harms,' and these reports must be accounted for in the annual audit submitted to the state.
Evidence:
Ambiguity Notes: The requirement to report 'harms that pose an imminent threat' could overlap with criminal reporting duties, though the bill focuses on civil design safety.
Why Relevant: The bill includes provisions that touch upon the privacy of communications, potentially impacting encrypted services.
Mechanism of Influence: The definition of 'sensitive personal data' includes the 'contents of an individual's mail, email, and text messages.' The duty to prevent 'harms' in these communications could create pressure to scan content.
Evidence:
Ambiguity Notes: Section 39-80-10(17)(d) excludes messages where the 'business is the intended recipient,' but the general duty to prevent 'severe psychological harm' or 'physical injury' (39-80-20) may conflict with end-to-end encryption if harms occur within private messages.
Legislation ID: 244736
Bill URL: View Bill
The Stop Harm from Addictive Social Media (SHASM) Act establishes regulations for covered social media platforms to protect minors from addictive features and harmful content. It mandates age verification, parental consent, and the implementation of privacy settings to ensure the safety of children using these platforms.
| Date | Action |
|---|---|
| 2026-01-13 | Introduced and read first time |
| 2026-01-13 | Referred to Committee on Judiciary |
| 2025-12-16 | Prefiled |
| 2025-12-16 | Referred to Committee on Judiciary |
Why Relevant: The bill mandates sophisticated age estimation and verification, a core pillar of the EU Chat Act's infrastructure for protecting minors.
Mechanism of Influence: Platforms must estimate ages with 80% to 90% confidence based on usage 'triggers' (25 and 50 hours of use) and require birth dates and parental consent for account creation.
Evidence:
Ambiguity Notes: The term 'reasonable means' for age estimation is not strictly defined, potentially allowing for biometric or behavioral analysis to meet the 90% confidence threshold.
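A sketch of the trigger logic, using the 25- and 50-hour thresholds and 80%/90% confidence targets from the bill; how a platform would actually reach a given confidence (behavioral signals, biometrics) is left open by the act and omitted here.

```python
# Usage-hour triggers and confidence targets taken from the bill's text.
TRIGGERS = [(25.0, 0.80), (50.0, 0.90)]  # (hours of use, required confidence)

def required_confidence(hours_used: float) -> float | None:
    """Return the age-estimation confidence the platform must have reached
    at this usage level, or None if no trigger has fired yet."""
    due = [conf for hours, conf in TRIGGERS if hours_used >= hours]
    return max(due) if due else None

def must_reestimate(hours_used: float, current_confidence: float) -> bool:
    """True when usage has crossed a trigger the current estimate can't satisfy."""
    target = required_confidence(hours_used)
    return target is not None and current_confidence < target
```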
Why Relevant: The bill compels 'mitigation' through product design changes, similar to the EU Act's focus on recommender systems.
Mechanism of Influence: It bans specific 'addictive features' like infinite scrolling and auto-play videos for minors, effectively forcing a different UI/UX for a specific class of users.
Evidence:
Ambiguity Notes: The definition of 'addictive features' is specific but includes broad categories like 'personal metrics' and 'push notifications' that may affect core platform functionality.
Why Relevant: The bill establishes a compliance infrastructure including parental controls and account termination protocols.
Mechanism of Influence: Platforms must provide parents with tools to monitor time, set limits, and request account termination within 14 days.
Evidence:
Ambiguity Notes: The verification of 'legal guardianship' for parental requests is left to the platform's own internal processes.
Legislation ID: 244789
Bill URL: View Bill
Bill 4665 amends the South Carolina Code of Laws by adding a new section that requires internet service providers to implement filtering measures against adult content. It outlines the definitions, requirements for filtering, procedures for deactivation of filters by consumers, and penalties for non-compliance with the provisions.
| Date | Action |
|---|---|
| 2026-01-13 | Introduced and read first time |
| 2026-01-13 | Referred to Committee on Judiciary |
| 2025-12-16 | Prefiled |
| 2025-12-16 | Referred to Committee on Judiciary |
Why Relevant: The bill mandates a state-wide age verification regime for internet access.
Mechanism of Influence: Section 37-1-320(E)(2) requires 'reasonable verification' that a user is 18 or older before an ISP can deactivate the mandatory filter, effectively requiring age-gating for unfiltered internet access.
Evidence:
Ambiguity Notes: The term 'reasonable verification' is not defined, leaving the specific technical requirements for identity or age proofing to the discretion of the ISP or future administrative rules.
Why Relevant: The bill compels providers to implement content filtering and blocking technologies.
Mechanism of Influence: ISPs must use 'reasonable commercially available means' to filter content, including 'in-network filtering' to 'prevent the display of or access to adult content.'
Evidence:
Ambiguity Notes: While 'adult content' is tied to existing legal definitions of 'harmful to minors,' the requirement for 'in-network filtering' could involve various levels of network-level inspection or DNS-based blocking.
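A DNS-level sketch of the filter-plus-opt-out flow; DNS blocking is only one plausible reading of 'in-network filtering,' and the domain list, subscriber registry, and resolution result below are placeholders.

```python
# Placeholder filter list and subscriber registry; the bill specifies neither.
BLOCKED_DOMAINS = {"example-adult-site.test"}
verified_adults: set[str] = set()  # subscribers who completed the opt-out

def resolve(subscriber_id: str, domain: str) -> str | None:
    """DNS-level sketch: refuse to resolve filtered domains unless the
    subscriber has passed the 'reasonable verification' required by
    Section 37-1-320(E)(2) to deactivate the filter."""
    if domain in BLOCKED_DOMAINS and subscriber_id not in verified_adults:
        return None  # blocked
    return "203.0.113.10"  # placeholder resolution result
```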
Legislation ID: 244777
Bill URL: View Bill
Bill 736 seeks to amend the South Carolina Code by adding a new section that mandates online companies with minor users to offer parental control options. This includes the ability for parents to opt out of content that is sexual in nature or features transgender individuals. Noncompliance would result in penalties for the companies, and individuals would have the right to pursue civil actions for violations.
| Date | Action |
|---|---|
| 2026-01-13 | Introduced and read first time |
| 2026-01-13 | Referred to Committee on Labor, Commerce and Industry |
| 2025-12-10 | Prefiled |
| 2025-12-10 | Referred to Committee on Labor, Commerce and Industry |
Why Relevant: The bill implicitly requires age verification or assessment to identify 'minor users' and verify parental relationships.
Mechanism of Influence: To comply with the mandate for 'minor users,' platforms must implement systems to reliably distinguish children from adults and link them to a verified parent, a core component of the EU Chat Act's infrastructure.
Evidence:
Ambiguity Notes: The bill does not specify the 'reliable technology' or method for age verification, leaving platforms to determine how to identify minors and parents.
Why Relevant: The bill mandates a form of exposure-limiting or content-filtering mitigation.
Mechanism of Influence: Platforms must develop or modify recommender systems and content delivery algorithms to enable an 'opt out' for specific categories of content, effectively requiring a classification system for all content available to minors.
Evidence:
Ambiguity Notes: The phrase 'sexual in nature' and 'content that features transgender individuals' is broad and lacks statutory definitions, potentially requiring aggressive automated scanning or human review to ensure compliance.
Legislation ID: 260262
Bill URL: View Bill
Senate Bill 1700, known as the Curbing Harmful AI Technology (CHAT) Act, amends Tennessee Code to introduce regulations governing artificial intelligence systems and companion chatbots. It defines key terms, outlines safety and design requirements, mandates transparency and data privacy protections, and establishes enforcement mechanisms to hold developers and deployers accountable for violations. The bill seeks to ensure that AI technologies do not harm minors and provides a framework for addressing issues related to mental health and user safety.
| Date | Action |
|---|---|
| 2026-01-15 | Filed for introduction |
Why Relevant: The bill mandates content detection protocols for specific user expressions.
Mechanism of Influence: Section 47-18-5905 requires developers to implement a 'protocol to take reasonable efforts for detecting and addressing suicidal ideation or expressions of self-harm.' This necessitates active monitoring or scanning of user-to-AI interactions.
Evidence:
Ambiguity Notes: While focused on mental health, the requirement to 'detect' specific expressions in a natural language interface mirrors the technical requirements of scanning for prohibited content.
Why Relevant: The bill includes safety testing and transparency requirements that function as a form of risk assessment and mitigation.
Mechanism of Influence: Section 47-18-5907(b) requires developers to 'publish safety test findings' conducted to ensure the AI does not encourage illegal activity or CSAM creation, effectively mandating public disclosure of risk mitigation efficacy.
Evidence:
Ambiguity Notes: The bill does not define the specific methodology for 'safety testing,' leaving the depth of the assessment to the developer or future regulation.
Why Relevant: The bill establishes a centralized reporting structure for harmful or illegal AI interactions.
Mechanism of Influence: Section 47-18-5907(c) mandates quarterly reports to the Attorney General regarding the frequency of AI outputs related to 'suicide, self-harm, suicidal ideation, harming others, or illegal activity.'
Evidence:
Ambiguity Notes: The reporting is currently aggregate (number of times) rather than individual user data, but it establishes the Attorney General as the central oversight body for these harms.
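A minimal sketch of aggregate reporting as the bill currently reads, assuming hypothetical category names: counters only, no user identifiers.

```python
from collections import Counter

REPORT_CATEGORIES = ("suicide", "self_harm", "suicidal_ideation",
                     "harming_others", "illegal_activity")

incident_counts = Counter()

def record_incident(category: str) -> None:
    if category in REPORT_CATEGORIES:
        incident_counts[category] += 1

def quarterly_report() -> dict:
    """What goes to the Attorney General: frequencies only, no user data."""
    return {c: incident_counts[c] for c in REPORT_CATEGORIES}
```

Aggregate-only reporting sits at the privacy-preserving end of the spectrum, but the counting infrastructure, once built, is trivially extensible to per-user reporting.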
Why Relevant: The bill creates a functional requirement for age verification or assessment.
Mechanism of Influence: Because Section 47-18-5903 prohibits providing specific chatbot features to 'minors' and Section 47-18-5906 requires parental consent for training on minor data, operators must 'reliably identify child users' to remain compliant.
Evidence:
Ambiguity Notes: The bill does not prescribe a specific technology for age verification, but the liability structure (civil penalties and private right of action) necessitates a robust identification mechanism.
This legislation updates provisions related to sexual exploitation and liability for obscenity and child sexual abuse material. It establishes definitions for key terms, outlines exemptions for legal representation, and specifies the rights of individuals to seek legal recourse against those who depict or distribute harmful material. Additionally, it clarifies the responsibilities of internet service providers and other entities regarding content they do not create.
| Date | Action |
|---|---|
| 2026-01-14 | House/ received bill from Legislative Research |
| 2026-01-06 | Bill Numbered but not Distributed |
| 2026-01-06 | Numbered Bill Publicly Distributed |
Why Relevant: The bill incentivizes the implementation of detection and scanning technologies through civil liability frameworks.
Mechanism of Influence: Section 78B-3-1004(3)(b) allows lawsuits against cloud service providers if they 'did not take reasonable precautions to prevent' CSAM or obscenity from appearing. This creates a functional mandate for automated scanning and filtering to mitigate legal risk.
Evidence:
Ambiguity Notes: The term 'reasonable precautions' is not defined, leaving it to courts to determine if failing to use 'state of the art' scanning or client-side analysis constitutes a lack of precaution.
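The most common way providers operationalize 'reasonable precautions' is hash-list scanning of uploads against known-bad hashes. A minimal sketch follows, using plain SHA-256 as a simplified stand-in for the perceptual hashing (e.g., the PhotoDNA approach) that production systems actually use; the hash list here is hypothetical.

```python
import hashlib

KNOWN_BAD_HASHES = set()  # hypothetical hash list supplied by a clearinghouse

def scan_upload(file_bytes: bytes) -> bool:
    """Return True if the upload matches a known-bad hash and should be blocked."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_BAD_HASHES
```

The undefined standard is what pushes providers past hash-matching toward client-side analysis: a court might later decide that anything short of state-of-the-art scanning was not a 'reasonable precaution.'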
Why Relevant: The bill provides immunity for entities that engage in proactive detection and reporting, mirroring the 'voluntary' but incentivized scanning regimes of the EU framework.
Mechanism of Influence: Section 76-5b-201(6)(a)(ii) protects entities from liability if they are 'implementing a policy of attempting to prevent the presence of child sexual abuse material... or of detecting and reporting' it.
Evidence:
Ambiguity Notes: The scope of 'detecting' is broad and could encompass various forms of intrusive monitoring of 'intangible property' (digital data).
Why Relevant: The bill defines 'reasonable age verification methods' and 'transactional data' for identity verification.
Mechanism of Influence: Section 78B-3-1001(14) outlines specific methods for verifying a user is 18 or older, including 'digitized information cards' and 'third-party age verification services.'
Evidence:
Ambiguity Notes: While these methods are defined, the bill ties them to 'material harmful to minors' rather than imposing a universal mandate on all messaging, though the definitions align with the EU's focus on reliable age assessment.
Why Relevant: The bill includes data preservation and reporting elements.
Mechanism of Influence: Section 76-5b-201(6)(a)(i) grants immunity for the performance of 'reporting or data preservation duties required under federal or state law.'
Evidence:
Ambiguity Notes: None
This bill amends and enacts provisions regarding access to sensitive material through digital instructional materials in school settings. It mandates that local education agencies provide information to parents, use tools to identify sensitive materials, and manage access to those materials effectively. The bill also establishes a private right of action for individuals affected by the misuse of sensitive materials.
| Date | Action |
|---|---|
| 2026-01-14 | House/ received bill from Legislative Research |
| 2026-01-08 | Bill Numbered but not Distributed |
| 2026-01-08 | Numbered Bill Publicly Distributed |
Why Relevant: The bill mandates the use of automated technology to detect and screen content for prohibited material, mirroring the EU CSA Act's detection obligations.
Mechanism of Influence: The state board is required to contract with a reviewer that 'uses technology, including artificial intelligence assisted analysis, to screen the instructional materials' for violations. (Section 53G-10-103(8)(e)(ii))
Evidence:
Ambiguity Notes: While 'instructional materials' is defined, the use of AI for screening is a broad mandate that could involve scanning vast amounts of digital content.
Why Relevant: The bill contains an explicit mandate to bypass encryption to facilitate content filtering and detection.
Mechanism of Influence: Local Education Agencies (LEAs) must ensure that filtering software is effective by requiring personnel to 'decrypt websites to ensure the efficacy of the filtering, including any online school library and other encrypted websites.' (Section 53G-10-103(11)(h)(iii))
Evidence:
Ambiguity Notes: The phrase 'other encrypted websites' is broad and could encompass any HTTPS traffic or encrypted communication platforms accessed on school devices.
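Why must the filter 'decrypt websites' at all? With TLS intact, a passive filter sees at most the destination hostname; seeing page paths or content requires a TLS-terminating proxy with its own certificate installed on every device. A minimal sketch of the distinction, with hypothetical names:

```python
def filter_visibility(tls_intercepted: bool) -> list:
    """What a school filter can observe about an HTTPS connection."""
    if tls_intercepted:
        # Proxy terminates TLS using its own CA certificate installed on devices:
        return ["hostname", "full URL", "request and response bodies"]
    # Passive inspection of unbroken TLS:
    return ["hostname (SNI)", "IP address", "traffic volume and timing"]
```

The 'other encrypted websites' phrasing therefore implies installing an interception certificate on every school device, breaking end-to-end confidentiality for all HTTPS traffic on them.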
Why Relevant: It establishes a centralized removal and blocking mechanism where a determination by a few local entities triggers a mandatory statewide takedown.
Mechanism of Influence: If a specific threshold of LEAs (three districts or two districts and five charters) determines a material is 'objective sensitive material,' every LEA in the state must remove it. (Section 53G-10-103(7))
Evidence:
Ambiguity Notes: None
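The trigger condition is simple enough to state in a few lines; a minimal sketch, with hypothetical argument names:

```python
def statewide_removal_required(districts: int, charters: int) -> bool:
    """Section 53G-10-103(7): three districts, or two districts plus five charters."""
    return districts >= 3 or (districts >= 2 and charters >= 5)
```

Worth noting: determinations by as few as three local entities bind every LEA in the state, with no statewide review step in the quoted text.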
Why Relevant: The bill imposes direct compliance and removal obligations on third-party digital vendors, similar to service provider duties in the EU Act.
Mechanism of Influence: Vendors must remove access to sensitive segments of digital material upon determination by an LEA; failure to comply after three instances results in mandatory contract rescission. (Section 53G-10-103(12)(b))
Evidence:
Ambiguity Notes: None
The Online Age Verification Amendments bill establishes a tax on gross receipts from sales and distributions of material harmful to minors in Utah. It mandates compliance notifications by commercial entities, creates a Teen Mental Health Restricted Account funded by the tax revenue, and outlines administrative procedures for monitoring and enforcing age verification requirements.
| Date | Action |
|---|---|
| 2026-01-07 | Senate/ received bill from Legislative Research |
| 2026-01-05 | Bill Numbered but not Distributed |
| 2026-01-05 | Numbered Bill Publicly Distributed |
Why Relevant: The bill mandates age verification compliance and state-level auditing of those mechanisms.
Mechanism of Influence: It requires commercial entities to 'certify that the entity... complies with the age verification requirements' and 'has implemented reasonable age verification methods' (Sec 78B-3-1004(4)).
Evidence:
Ambiguity Notes: The bill references Section 78B-3-1002 for specific age verification standards, which are not fully detailed in this text, leaving the technical 'reasonableness' of the methods to be defined by the division or existing law.
Why Relevant: The bill establishes a compliance and oversight infrastructure similar to the EU's coordinating authorities.
Mechanism of Influence: It authorizes the Division of Consumer Protection to 'monitor and audit compliance' (Sec 78B-3-1004(2)) and 'conduct audits of commercial entities to verify compliance' (Sec 78B-3-1004(5)).
Evidence:
Ambiguity Notes: The 'standards for monitoring and auditing' are left to future rulemaking by the division (Sec 78B-3-1004(6)).
Why Relevant: The bill creates a financial and administrative burden for platforms to operate, tied to their ability to gate access.
Mechanism of Influence: Entities must pay an 'annual notification fee of $500' (Sec 78B-3-1004(2)) and face 'administrative penalty of $1,000 for each day' of non-notification (Sec 78B-3-1004(7)).
Evidence:
Ambiguity Notes: The definition of 'substantial portion' of material harmful to minors (Sec 78B-3-1004(1)) determines which platforms are captured, which may be subject to broad interpretation.
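The cost structure is straightforward arithmetic; a minimal sketch:

```python
def annual_compliance_cost(days_without_notification: int) -> int:
    FEE = 500             # annual notification fee, Sec 78B-3-1004(2)
    DAILY_PENALTY = 1000  # per day of non-notification, Sec 78B-3-1004(7)
    return FEE + DAILY_PENALTY * days_without_notification

# Example: 30 days without notification costs 500 + 30 * 1000 = $30,500.
```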
Legislation ID: 258707
Bill URL: View Bill
This bill amends existing consumer protection laws and introduces a new chapter specifically addressing the use of artificial intelligence chatbots in consumer transactions. It prohibits practices such as misrepresentation and failure to disclose necessary information when companies deploy AI chatbots, and specifies penalties for violations to ensure compliance and protect consumers.
| Date | Action |
|---|---|
| 2026-01-13 | Committee Referral Pending |
| 2026-01-13 | Prefiled and ordered printed; Offered 01-14-2026 26105121D |
Why Relevant: The bill mandates age verification or assessment for users of companion chatbots.
Mechanism of Influence: Operators are legally required to 'reasonably determine' that a user is not a minor before providing access to chatbots with certain capabilities.
Evidence:
Ambiguity Notes: The term 'reasonably determine' is not defined, leaving open the possibility of high-friction or privacy-invasive age-assurance technologies.
Why Relevant: The bill compels the detection of specific user content within private, personal dialogues.
Mechanism of Influence: Operators must implement a protocol that includes the 'detection of user expressions' related to self-harm or suicidal ideation.
Evidence:
Ambiguity Notes: While the detection mandate is currently limited to self-harm, it establishes a statutory requirement for automated scanning of user-to-AI communications.
Why Relevant: The bill mandates risk mitigation specifically targeting the creation of child sexual abuse materials (CSAM).
Mechanism of Influence: It prohibits providing chatbots to minors if the system is 'capable' of encouraging the creation of CSAM, effectively requiring developers to implement safety guardrails and content filtering.
Evidence:
Ambiguity Notes: The requirement that a chatbot not be 'capable' of such actions may imply a duty for ongoing monitoring and proactive technical restrictions.
Why Relevant: The bill establishes reporting and transparency obligations for safety-related incidents.
Mechanism of Influence: Operators must provide public quarterly reports on the frequency of harmful content generation and the use of mental health redirects.
Evidence:
Ambiguity Notes: None
Legislation ID: 258830
Bill URL: View Bill
This bill introduces a new chapter in the Code of Virginia that outlines the responsibilities of app store providers and developers regarding the handling of minors' accounts and data. It mandates parental consent for minors, requires age verification, and sets penalties for non-compliance, aiming to make app stores more accountable for protecting young users.
| Date | Action |
|---|---|
| 2026-01-13 | Committee Referral Pending |
| 2026-01-13 | Prefiled and ordered printed; Offered 01-14-2026 26103955D |
Why Relevant: The bill mandates age verification for all account holders, a core component of the EU's child protection framework.
Mechanism of Influence: Providers must use 'commercially available methods' to verify the age category of every individual creating or holding an account, effectively ending anonymous access to app stores.
Evidence:
Ambiguity Notes: The term 'commercially available method' is not defined, potentially allowing for intrusive biometric or government ID-based checks.
Why Relevant: The bill imposes significant gatekeeping duties on app stores to regulate access for minors.
Mechanism of Influence: App store providers are required to block downloads and in-app purchases for minors unless 'verifiable parental consent' is obtained, shifting the burden of child safety enforcement to the platform level.
Evidence:
Ambiguity Notes: The requirement for 'renewed verifiable parental consent' after 'significant changes' to an app could lead to frequent access interruptions and continuous monitoring of app updates.
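A minimal sketch of the gatekeeping decision the bill implies, with all names hypothetical:

```python
def may_download(age_category: str,
                 parental_consent_on_file: bool,
                 app_changed_significantly: bool) -> bool:
    if age_category != "minor":
        return True
    if not parental_consent_on_file:
        return False  # block the download or in-app purchase
    if app_changed_significantly:
        return False  # "renewed verifiable parental consent" required first
    return True
```

Every branch above presupposes that the store has already verified the age category of the account holder, which is why the anonymity cost falls on all users, not only minors.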
Why Relevant: The bill requires developers to categorize and label content, similar to the risk-mitigation and transparency features of the EU Act.
Mechanism of Influence: Developers must assign age ratings and provide 'content descriptions' that inform the app store's parental consent disclosures, creating a standardized suitability assessment for all software.
Evidence:
Ambiguity Notes: While the bill provides a safe harbor for developers using 'widely adopted industry standards,' it does not specify which standards are acceptable.
Why Relevant: The bill includes provisions for data preservation and internal controls regarding age-related data.
Mechanism of Influence: It mandates the protection of age verification data and limits its use to compliance purposes, while requiring the use of encryption for data transmission.
Evidence:
Ambiguity Notes: The requirement to maintain 'compliance records' implies a data retention obligation for sensitive age-verification information.
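A minimal sketch of the data-handling duty, assuming a hypothetical reporting endpoint and using the third-party `requests` library purely for illustration (HTTPS supplies the encryption in transit):

```python
import requests  # third-party; HTTPS gives TLS encryption in transit

def submit_compliance_record(result: dict) -> None:
    # Transmit only minimal compliance fields, never the raw ID image.
    minimal = {"account_id": result["account_id"],
               "age_category": result["age_category"],
               "verified_at": result["verified_at"]}
    requests.post("https://verifier.example/api/records",  # hypothetical endpoint
                  json=minimal, timeout=10)
```

The tension the ambiguity note flags: 'compliance records' must prove that verification happened, so some identity-linked data persists no matter how careful the transmission layer is.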
Why Relevant: The bill explicitly allows for the detection and blocking of 'harmful material,' though it does not mandate specific scanning technologies.
Mechanism of Influence: It provides a legal shield for providers who take measures to 'block, detect, or prevent' the distribution of harmful or unlawful material to minors.
Evidence:
Ambiguity Notes: While not a 'detection order,' this language encourages the implementation of filtering and detection systems to avoid liability.
Legislation ID: 258831
Bill URL: View Bill
House Bill No. 758 seeks to amend the existing consumer protection laws in Virginia by adding provisions specifically addressing the use of artificial intelligence chatbots when interacting with minors. The bill outlines prohibited practices for suppliers of such technology and establishes penalties for violations. It aims to ensure that minors are not subjected to deceptive practices and are provided with appropriate disclosures when engaging with AI chatbots.
| Date | Action |
|---|---|
| 2026-01-13 | Committee Referral Pending |
| 2026-01-13 | Prefiled and ordered printed; Offered 01-14-2026 26103964D |
Why Relevant: The bill mandates age verification for AI messaging and interaction services.
Mechanism of Influence: Section 59.1-615 requires deployers to 'implement reasonable age verification systems' to ensure that chatbots with human-like features or social companion functions are not accessible to minors.
Evidence:
Ambiguity Notes: The term 'reasonable age verification systems' is not defined, leaving the specific technical requirements (e.g., ID upload, biometric estimation) to the discretion of the deployer or future regulatory interpretation.
Legislation ID: 253030
Bill URL: View Bill
This bill amends existing sections of the Code of Virginia related to consumer data protection, particularly addressing the handling of personal data for children. It introduces new definitions and clarifies the requirements for obtaining consent from parents or guardians before processing personal data of children. Additionally, it outlines the scope of the bill and the exemptions applicable under certain circumstances.
| Date | Action |
|---|---|
| 2026-01-09 | Prefiled and ordered printed; Offered 01-14-2026 26100496D |
| 2026-01-09 | Referred to Committee on General Laws and Technology |
Why Relevant: The bill mandates a robust age-verification and parental-consent framework for all users under 18 years of age.
Mechanism of Influence: It requires controllers to gate service registration and data processing behind a verification wall, explicitly suggesting the use of government-issued IDs or payment system notifications to confirm a parent's identity.
Evidence:
Ambiguity Notes: The phrase 'reasonable efforts... taking into consideration available technology' is undefined, potentially allowing for the implementation of intrusive biometric or third-party identity-verification systems.
Why Relevant: The bill establishes a specific scope for 'Social media platforms' that facilitate user interaction and content creation.
Mechanism of Influence: By defining social media platforms based on their ability to 'interact socially' and 'create or post content,' the bill targets the same interpersonal communication services that are the focus of the EU Chat Act's safety obligations.
Evidence:
Ambiguity Notes: None
House Bill 2112 seeks to protect minors from accessing sexual material harmful to them by requiring commercial entities to implement reasonable age verification methods. The bill defines relevant terms, stipulates penalties for violations, and clarifies that news-gathering organizations are exempt from its provisions. It also mandates notices regarding youth health risks associated with adult content and ensures that no identifying information is retained during the age verification process.
| Date | Action |
|---|---|
| 2025-12-08 | Prefiled for introduction. |
Why Relevant: The bill mandates age verification for access to specific online content, which is a core component of the EU's approach to online safety and child protection.
Mechanism of Influence: Section 2 and Section 3 require commercial entities to implement 'reasonable age verification methods' such as digital ID or government-issued identification to gate access to adult content.
Evidence:
Ambiguity Notes: The term 'commercially reasonable method that relies on public or private transactional data' is broad and could encompass various third-party data-matching services.
This bill introduces regulations for operators of companion chatbots, requiring them to notify users about the nature of the interaction, implement protocols to prevent the production of content promoting suicidal ideation, and provide referrals to crisis services. It also establishes civil liability for operators whose systems contribute to user harm, particularly in cases of suicide. The bill mandates annual reporting to the Department of Health on related incidents and protocols, and outlines operators' responsibilities regarding minors and the content their systems generate.
| Date | Action |
|---|---|
| 2025-12-11 | Prefiled for introduction. |
Why Relevant: The bill mandates content detection and mitigation protocols for specific categories of harmful content.
Mechanism of Influence: Operators must maintain a 'protocol for preventing the production of suicidal ideation' (Sec. 2(2)(a)) and report on protocols used to 'detect, remove, and respond to instances of suicidal ideation' (Sec. 3(1)(b)).
Evidence:
Ambiguity Notes: While focused on suicide, the requirement to 'detect' user inputs mirrors the EU's compelled detection orders, potentially requiring scanning of natural language interactions.
Why Relevant: The bill includes specific protections for minors against sexually explicit content, a core pillar of the EU Chat Act.
Mechanism of Influence: Operators must 'institute reasonable measures to prevent its companion chatbot from producing visual material of sexually explicit conduct' (Sec. 2(3)(c)) for users known to be minors.
Evidence:
Ambiguity Notes: The term 'reasonable measures' is not defined, leaving open whether this requires proactive filtering or client-side analysis.
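A minimal sketch of one plausible reading of 'reasonable measures': a post-generation safety gate applied when the user is a known minor. The classifier is a hypothetical placeholder.

```python
def is_sexually_explicit(image_bytes: bytes) -> bool:
    # Hypothetical placeholder; a real implementation is an ML safety model.
    return False

def deliver_image(image_bytes: bytes, user_is_minor: bool):
    if user_is_minor and is_sexually_explicit(image_bytes):
        return None  # suppress the output rather than deliver it
    return image_bytes
```

Whether this counts as 'reasonable' is exactly the open question: a post-hoc gate is cheaper than proactive filtering or client-side analysis, but a court could demand more.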
Why Relevant: It establishes a centralized reporting and oversight infrastructure.
Mechanism of Influence: Operators are required to 'annually report to the department' (Sec. 3(1)) of health regarding their detection protocols and the frequency of crisis referrals.
Evidence:
Ambiguity Notes: The Department of Health acts as a state-level coordinating body for AI safety reporting, similar to the EU's proposed Coordinating Authorities.
Why Relevant: The bill includes age-based gatekeeping and disclosure requirements.
Mechanism of Influence: Operators must disclose to users that chatbots 'may not be suitable for some minors' (Sec. 4) and provide specific notifications for users 'the operator knows is a minor' (Sec. 2(3)).
Evidence:
Ambiguity Notes: The bill relies on the operator 'knowing' a user is a minor but does not explicitly mandate a specific age-verification technology, though Sec. 4 implies a general duty of care regarding age suitability.
Legislation ID: 112488
Bill URL: View Bill
This bill introduces regulations concerning the publication and distribution of materials harmful to minors on the Internet. It mandates that business entities implement reasonable age verification methods before allowing access to such materials. The bill defines material harmful to minors and obscene material, and establishes penalties for violations, including civil liability. It also includes exemptions for bona fide news organizations and protects Internet service providers from liability under certain conditions.
| Date | Action |
|---|---|
| 2026-01-07 | Senate Amendment 2 offered by Senator Wanggaard |
| 2025-12-03 | Senate Amendment 1 offered by Senator Wanggaard |
| 2025-11-12 | Available for scheduling |
| 2025-11-12 | Executive action taken |
| 2025-11-12 | Report concurrence recommended by Committee on Mental Health, Substance Abuse Prevention, Children and Families, Ayes 3, Noes 2 |
| 2025-10-08 | Public hearing held |
| 2025-03-21 | Read first time and referred to committee on Mental Health, Substance Abuse Prevention, Children and Families |
| 2025-03-20 | Assembly Amendment 1 adopted |
Why Relevant: The bill mandates age verification for specific online platforms, a core component of the EU Chat Act's safety framework.
Mechanism of Influence: It requires business entities to use government IDs or transactional data to 'reliably identify child users' and gate access to content deemed harmful.
Evidence:
Ambiguity Notes: The definition of 'commercially reasonable method' for transactional data is not strictly defined, potentially allowing for various third-party verification technologies.
Why Relevant: The bill includes a specific enforcement mechanism to prevent the bypass of age-verification controls.
Mechanism of Influence: It compels platforms to implement technical blocking of VPN IP addresses, which acts as a form of access control and infrastructure mandate.
Evidence:
Ambiguity Notes: None
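A minimal sketch of IP-range blocking using the standard-library `ipaddress` module; the ranges shown are RFC 5737 documentation addresses, since real VPN lists are commercial and churn constantly.

```python
import ipaddress

# Documentation ranges stand in for a real (commercial) VPN exit-node list.
VPN_RANGES = [ipaddress.ip_network("198.51.100.0/24"),
              ipaddress.ip_network("203.0.113.0/24")]

def block_request(client_ip: str) -> bool:
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in VPN_RANGES)
```

IP-range blocking is inherently lossy: it over-blocks legitimate privacy tools and under-blocks residential proxies, so compliance here is as much symbolic as technical.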
Why Relevant: The bill addresses data preservation and privacy by limiting the retention of identity data used for verification.
Mechanism of Influence: It creates a statutory duty to delete identifying information immediately after the access decision, impacting how compliance infrastructure is built.
Evidence:
Ambiguity Notes: None
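A minimal sketch of the verify-then-delete pattern the statute implies, with a hypothetical verification backend; the `finally` block is the statutory duty.

```python
def check_age(id_document: bytes) -> bool:
    raise NotImplementedError("hypothetical verification backend")

def access_decision(id_document: bytearray) -> bool:
    try:
        return check_age(bytes(id_document))
    finally:
        # Statutory duty: destroy identifying information immediately
        # after the access decision is made.
        for i in range(len(id_document)):
            id_document[i] = 0
```

Zeroing memory is best-effort in a garbage-collected language; the real guarantee has to come from never writing the document to durable storage in the first place.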
Legislation ID: 231487
Bill URL: View Bill
This bill introduces regulations concerning social media platforms' treatment of minors, defining a minor as anyone under 18 years of age. It prohibits platforms from collecting or using data related to minors, restricts targeted advertising, and mandates age verification methods. Additionally, it establishes enforcement mechanisms through the Department of Justice, including penalties for violations and a platform for public complaints.
| Date | Action |
|---|---|
| 2025-12-12 | Introduced by Senators Roys, Wall, Pfaff, L. Johnson, Keyeski, Wanggaard, Wirch and Dassler-Alfheim; cosponsored by Representatives McGuire, Fitzgerald, McCarville, Miresse, Palmeri, Prado, Sinicki, Snodgrass, Stroud, Stubbs, Tenorio and Udell |
| 2025-12-12 | Read first time and referred to Committee on Utilities, Technology and Tourism |
Why Relevant: The bill mandates age verification for all users to identify minors, a core component of the EU Chat Act's compliance infrastructure.
Mechanism of Influence: Platforms must implement a 'reliable, industry-accepted method approved by the department of justice' to determine user age, effectively requiring identity or age-gating for access.
Evidence:
Ambiguity Notes: The term 'reliable, industry-accepted method' is not defined, leaving the specific technology (e.g., biometric estimation, ID upload) to DOJ discretion.
Why Relevant: The bill mandates mitigation changes to recommender systems and product features.
Mechanism of Influence: It prohibits platforms from using personal data to 'select, prioritize, or deprioritize' content for minors, forcing a move away from engagement-based algorithms to chronological or search-based feeds.
Evidence:
Ambiguity Notes: While aimed at mental health, this mirrors the EU Act's 'mitigation' requirements regarding how algorithms surface content to children.
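A minimal sketch of the mitigation the bill forces, assuming hypothetical post fields: personalized ranking for adults, plain chronological ordering for minors.

```python
def rank_feed(posts: list, user_is_minor: bool) -> list:
    if user_is_minor:
        # No personal data may select, prioritize, or deprioritize content:
        return sorted(posts, key=lambda p: p["posted_at"], reverse=True)
    # Engagement-based ranking (built on behavioral data) remains lawful for adults.
    return sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)
```

The catch: serving the minor-specific feed still requires knowing who the minors are, which loops back to the bill's age-verification mandate.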
Why Relevant: The bill's scope explicitly includes private messaging services.
Mechanism of Influence: By defining social media platforms to include those where private messaging is a 'significant part,' the bill's data and age-verification mandates apply to interpersonal communication services.
Evidence:
Ambiguity Notes: The phrase 'significant part' is subjective and could capture a wide range of encrypted and unencrypted messaging apps.