Well-intentioned U.S. online-safety proposals increasingly require platforms to prove user age, vet content at scale, and retain sensitive artifacts. Recent events in the UK show how such mandates can inadvertently manufacture high-value targets—centralized stores of identity documents and support-desk integrations—creating systemic risk even when a platform’s core production systems are untouched. See the UK regulator’s timeline and the duties that came into force in July 2025 (Ofcom quick guide; Ofcom statement; Ofcom age-checks in force; UK government explainer).
Consider the October 2025 incident disclosed by Discord: the company says a third-party support vendor (identified as 5CA in its press note) was compromised—not Discord’s own systems—and that approximately 70,000 users’ government-ID photos (submitted for age-appeal reviews) may have been exposed. Discord disputes threat-actor claims of “millions” of ID images and refused to pay a ransom (BleepingComputer; Tom’s Hardware; The Verge (initial coverage)). Reporting also notes the support stack involved Zendesk, through which ticket data and attachments were allegedly exfiltrated (The Verge; SOCRadar).
The situation remains contested: after Discord named 5CA, the vendor publicly pushed back and suggested human error rather than a direct 5CA breach, underscoring the complexity of outsourced compliance chains (The Verge follow-up; SecurityWeek). Regardless of disputed counts (hackers claimed multi-terabyte theft and 5.5M affected), the core operational lesson is clear: age-verification workflows concentrate extremely sensitive documents (government IDs, selfies) in downstream support and KYC ecosystems—systems that often have broader access and weaker segmentation than primary production environments (The Guardian).
Bottom line: Child-protection goals are vital. But without careful legislative design—data minimization, in-product age signals over document uploads, strict vendor segmentation and least-privilege access, short retention, and verifiable privacy guarantees—well-meaning statutes may manufacture the very catastrophes they aim to prevent.
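To make "in-product age signals over document uploads" concrete, here is a minimal sketch, assuming only a platform-held HMAC key and the Python standard library: a checker inspects a document once, emits a short-lived signed boolean claim, and retains nothing else. Every name here is illustrative, not any vendor's API.

```python
import base64
import hashlib
import hmac
import json
import time

SIGNING_KEY = b"platform-held secret; rotate regularly"  # hypothetical

def issue_age_signal(is_over_18: bool, ttl_seconds: int = 3600) -> str:
    """Issue a short-lived signed claim. Note what is absent: no ID image,
    no name, no birthdate -- only a boolean and an expiry."""
    claim = {"over18": is_over_18, "exp": int(time.time()) + ttl_seconds}
    body = base64.urlsafe_b64encode(json.dumps(claim).encode())
    sig = base64.urlsafe_b64encode(hmac.new(SIGNING_KEY, body, hashlib.sha256).digest())
    return (body + b"." + sig).decode()

def check_age_signal(token: str) -> bool:
    """Verify the signature and expiry; never touches source documents."""
    body, sig = token.encode().split(b".", 1)
    expected = base64.urlsafe_b64encode(hmac.new(SIGNING_KEY, body, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        return False
    claim = json.loads(base64.urlsafe_b64decode(body))
    return bool(claim["over18"]) and claim["exp"] > time.time()
```

The design choice worth noticing is that a relying service checks the token, not the person: there is no central store of ID photos for a support vendor to leak.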
Further reading: Discord press statement · BleepingComputer coverage · The Verge · Tom’s Hardware · SOCRadar analysis · The Guardian (age-verification context) · Ofcom: age checks in force · Ofcom: children’s codes · UK gov explainer · SecurityWeek: 5CA response
Unlike previous blog posts, this report uses our similarity search engine instead of our concept search engine. To perform the search, I used the text of the second chapter of the EU's Chat Control Act as the query and applied the filtering prompt below.
Role: You are a legislative analyst. Your job is to evaluate a single U.S. bill (federal or state) to determine whether it is a "spiritual successor" to the EU Chat Act—that is, whether it meaningfully replicates its risk-assessment, mitigation, compelled detection/scanning, age-verification, app-store gatekeeping, reporting, removal/blocking, data-preservation, and compliance-infrastructure features aimed at preventing child sexual abuse online, with material implications for privacy, encryption, and user rights.
Guidelines: Work strictly from the provided bill text. Be conservative; prefer explicit statutory language over inferences. Quote only short, necessary phrases with section cites.
Mark YES/LIKELY when the bill includes one or more of the following core elements (strong signals), especially in combination:
Mark NO/UNLIKELY when the bill is only about:
"risk assessment," "risk mitigation," "implementation plan," "coordinating authority," "trusted flagger," "detection order," "install technologies," "indicators," "hash," "solicitation of children," "age verification/assessment," "reliably identify child users," "app store," "interpersonal communication service," "private communications," "reporting obligations," "template," "central clearinghouse/center," "removal order," "blocking order," "URL list," "data preservation," "user redress," "point of contact," "legal representative," "end-to-end encryption," "client-side scanning," "least intrusive," "error rate," "state of the art."
I limited the search to the first 120 results per state and dropped every result the filtering prompt labeled "false". The net cast by this search is already quite wide, so marginal results are included in this report.
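Mechanically, the pipeline is simple. The sketch below shows its shape under stated assumptions: `similarity_search` and `classify_bill` are hypothetical wrappers standing in for our search engine and for the LLM call that carries the prompt above; neither is a real API.

```python
from dataclasses import dataclass

@dataclass
class BillHit:
    legislation_id: int
    state: str
    text: str

def similarity_search(query_text: str, state: str, limit: int) -> list[BillHit]:
    """Hypothetical wrapper around the similarity search engine."""
    raise NotImplementedError

def classify_bill(filter_prompt: str, bill_text: str) -> bool:
    """Hypothetical LLM call: True if the bill is flagged as a
    'spiritual successor' under the filtering prompt."""
    raise NotImplementedError

def run_report(chat_act_chapter_two: str, filter_prompt: str, states: list[str]) -> list[BillHit]:
    kept = []
    for state in states:
        # Cap at the first 120 results per state, as described above.
        for hit in similarity_search(chat_act_chapter_two, state, limit=120):
            # Drop anything the filtering prompt labels "false".
            if classify_bill(filter_prompt, hit.text):
                kept.append(hit)
    return kept
```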
Legislation ID: 18639
Bill URL: View Bill
HB276 requires social media platforms to implement various measures to protect minors under 18 years of age. These include displaying notifications for minor users, prohibiting direct messaging from adults unless the users are already connected, implementing age verification processes, providing parental controls, and responding to inquiries from the Attorney General. The bill also establishes penalties for non-compliance and requires the State Department of Education to develop guidelines on the mental and physical health impacts of social media use by minors.
Date | Action |
---|---|
2025-02-13 | Pending Committee Action in House of Origin |
Why Relevant: Explicitly mandates 'age verification process' for all users, requiring platforms to verify user ages.
Mechanism of Influence: This compels platforms to implement mechanisms to reliably identify minor users and gate access based on age.
Evidence:
Ambiguity Notes: The bill specifies 'commercially reasonable' age verification but does not detail technical standards or third-party checks.
Why Relevant: Requires parental controls and tools for parents to manage minor accounts, including deletion and privacy settings.
Mechanism of Influence: Mandates platforms provide infrastructure for parental oversight, account deletion, and restriction of features for minors.
Evidence:
Ambiguity Notes: Scope is broad, but does not specify technical requirements for these controls.
Why Relevant: Prohibits direct messaging from adults to minors unless already connected, and restricts collection and use of minors' personal information.
Mechanism of Influence: Requires platforms to moderate interpersonal communications and limit data processing for minors.
Evidence:
Ambiguity Notes: Does not specify if this entails proactive scanning or detection of adult-minor messaging attempts.
Legislation ID: 18676
Bill URL: View Bill
This bill requires app store providers to verify the age of users, affiliate minor accounts with parent accounts, and obtain parental consent before allowing minors to access certain apps. It also requires notification of significant app changes, real-time access to age category data for developers, and protection of personal age verification data. Furthermore, it prohibits app store providers and developers from enforcing contracts against minors without parental consent and from misrepresenting information. The Attorney General is authorized to enforce compliance and impose penalties for violations.
Date | Action |
---|---|
2025-02-20 | Pending Committee Action in House of Origin |
Why Relevant: The bill explicitly mandates that app store providers must verify the age of users and obtain verifiable parental consent for minors before allowing app downloads or purchases. This is a direct form of age verification and gating access for minors—one of the core elements of the EU Chat Act.
Mechanism of Influence: App stores are required to implement robust age verification systems and link minor accounts to parent accounts, effectively preventing unverified minors from accessing certain apps and providing a compliance mechanism for parental oversight.
Evidence:
Ambiguity Notes: The bill references 'commercially available methods' or 'compliant age verification systems' but does not specify the exact technologies or thresholds, leaving room for interpretation regarding the strength and intrusiveness of the verification.
Why Relevant: The bill requires real-time access to age category data and parental consent status for developers. This infrastructure supports ongoing compliance and enforcement, another element seen in the EU Chat Act's compliance infrastructure requirements.
Mechanism of Influence: Developers can programmatically restrict or tailor content and features based on verified age and parental consent status, which aligns with risk mitigation and compliance infrastructure.
Evidence:
Ambiguity Notes: The bill does not specify additional risk assessment or mitigation obligations beyond age gating and parental consent.
Why Relevant: The Attorney General is given explicit authority to adopt rules for age verification and enforce compliance, including penalties. This central enforcement authority is similar to the centralized enforcement seen in the EU Chat Act.
Mechanism of Influence: Creates an enforcement regime with the power to impose penalties and adopt further rules, ensuring compliance and deterrence.
Evidence:
Ambiguity Notes: The scope of the Attorney General's rulemaking is broad but not detailed in the text provided.
Why Relevant: The bill includes data protection requirements for age verification data, mandating minimal collection and secure transmission. While not a direct analogue to the EU Chat Act's data preservation requirements, it is relevant to the infrastructure and privacy implications.
Mechanism of Influence: Limits the collection and sharing of sensitive age verification data, requiring encryption and restricting use to compliance and safety.
Evidence:
Ambiguity Notes: Does not require preservation or reporting of content/metadata related to abuse or CSAM; focus is on data minimization and security.
Legislation ID: 18677
Bill URL: View Bill
This bill mandates that all smartphones and tablets manufactured after January 1, 2027, include a filter that restricts access to obscene material for users identified as minors. The filter must be enabled during device activation and can only be deactivated or reactivated with a password. Manufacturers who fail to comply will face civil liability, and parents or guardians can take legal action if a minor accesses inappropriate content due to non-compliance.
Date | Action |
---|---|
2025-02-20 | Pending Committee Action in House of Origin |
Why Relevant: The bill imposes a duty on device manufacturers to implement age-based filtering and age determination during device activation, which touches on age verification and mandatory filtering—features relevant to the EU Chat Act's child protection aims.
Mechanism of Influence: Requires age determination ('Devices must determine the age of the user during activation') and mandates that filters be 'set to on for minor users,' with password controls for deactivation/reactivation. This creates a statutory obligation for device-level content restriction based on age.
Evidence:
Ambiguity Notes: The bill does not specify the technical method for age determination, nor does it define the scope or type of 'filter' beyond blocking 'obscene material.' There is no mention of ongoing risk assessment, content scanning, reporting, or moderation requirements for platforms/services.
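Because the bill names only the outcome (filter on for minors, password-gated toggling) and no method, the following is an illustrative activation flow, not the statute's mechanism; the self-attested birth year shown is an assumption the text neither requires nor forbids.

```python
import hashlib
from dataclasses import dataclass

@dataclass
class DeviceState:
    user_is_minor: bool
    filter_enabled: bool

def activate_device(self_reported_birth_year: int, activation_year: int) -> DeviceState:
    # The bill requires age determination at activation but names no method;
    # self-attestation is assumed here purely for illustration.
    is_minor = (activation_year - self_reported_birth_year) < 18
    # Filter defaults ON for minor users, per the mandate.
    return DeviceState(user_is_minor=is_minor, filter_enabled=is_minor)

def set_filter(state: DeviceState, enable: bool, password: str, stored_hash: str) -> None:
    # Deactivation or reactivation is password-gated under the bill.
    if hashlib.sha256(password.encode()).hexdigest() != stored_hash:
        raise PermissionError("password required to change filter state")
    state.filter_enabled = enable
```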
Legislation ID: 171204
Bill URL: View Bill
This bill seeks to prohibit Internet service providers, cell phone service providers, and content delivery networks from facilitating access to abortion-producing drugs within Alabama. It establishes penalties for violations, including felonies for service providers who allow such access, and requires the Attorney General to monitor and enforce compliance. Additionally, it allows for civil action against those involved in the provision of abortion-producing drugs and allocates collected fines to support pregnancy resource centers.
Date | Action |
---|---|
2025-04-24 | Pending Committee Action in House of Origin |
2025-04-24 | Read for the first time and referred to the House Committee on Judiciary |
Why Relevant: The bill requires ISPs and similar providers to block access to domain names and IP addresses associated with abortion drugs, similar to removal/blocking orders seen in the EU Chat Act.
Mechanism of Influence: Service providers must implement technical measures to restrict access to specific internet resources, under threat of felony penalties.
Evidence:
Ambiguity Notes: While the bill specifies blocking domain names and IPs, it does not mention automated detection, risk assessments, or scanning technologies.
Why Relevant: Empowers the Attorney General to monitor the internet for violations and maintain enforcement.
Mechanism of Influence: Creates a government enforcement role similar to a coordinating or monitoring authority.
Evidence:
Ambiguity Notes: Does not specify technical standards or proactive provider obligations beyond blocking.
Why Relevant: Establishes criminal penalties for providers who fail to block prohibited content.
Mechanism of Influence: Creates strong incentives for compliance through felony charges.
Evidence:
Ambiguity Notes: Penalties are for noncompliance with blocking, not for failing to proactively detect.
Why Relevant: Provides immunity for providers who take action to restrict access.
Mechanism of Influence: Encourages over-blocking to avoid liability.
Evidence:
Ambiguity Notes: No details on required technical measures or error rates.
Legislation ID: 18832
Bill URL: View Bill
SB186 requires manufacturers of smartphones and tablets activated in Alabama to include filters that block access to obscene content. The bill outlines the responsibilities of manufacturers and retailers, establishes civil liabilities for violations, and allows parents or guardians to take legal action if minors access prohibited content. It seeks to safeguard minors from inappropriate online material through the use of technology.
Date | Action |
---|---|
2025-05-14 | Enacted |
2025-05-01 | Chambliss Concur In and Adopt - Adopted Roll Call 905 |
2025-05-01 | Signature Requested |
2025-05-01 | Delivered to Governor |
2025-05-01 | Enrolled |
2025-05-01 | Ready to Enroll |
2025-04-29 | Motion to Adopt - Adopted Roll Call 1098 |
2025-04-29 | Commerce and Small Business Engrossed Substitute Offered |
Why Relevant: The bill mandates age verification at device activation and requires automatic enabling of content filters for minors.
Mechanism of Influence: Manufacturers must implement age checks and ensure filters are enabled for minors, directly impacting device setup and user privacy.
Evidence:
Ambiguity Notes: The age verification method is not specified—could range from self-attestation to more intrusive checks. 'Obscene content' definition may not fully overlap with CSAM but could be interpreted broadly.
Why Relevant: The bill requires persistent content filtering technology on all covered devices, which must block access to certain material for minors.
Mechanism of Influence: This introduces a technical obligation for device manufacturers to pre-install and maintain filter software, potentially affecting device functionality and user rights.
Evidence:
Ambiguity Notes: No explicit requirement for scanning user communications or files—filtering is for web/content access, not interpersonal messaging or private comms.
Why Relevant: The bill includes civil liability for manufacturers and others who disable the filter, if a minor accesses obscene content as a result.
Mechanism of Influence: Creates strong incentives for manufacturers and third parties to ensure filters are not disabled for minors, and for robust age verification.
Evidence:
Ambiguity Notes: Liability attaches only if disabling is 'intentional' and leads to access; not clear how 'intentional' is proved.
Why Relevant: The Attorney General is given enforcement powers, including investigation and civil penalties.
Mechanism of Influence: Centralized enforcement and penalties may drive compliance and create a regulatory infrastructure.
Evidence:
Ambiguity Notes: No mention of reporting obligations, centralized clearinghouse, or data preservation.
Legislation ID: 18833
Bill URL: View Bill
SB187 establishes a framework for app store providers to verify the age of users, particularly minors, and to manage parental consent requirements for app downloads and purchases. It mandates that app store providers protect personal age verification data and prohibits them from enforcing contracts against minors without proper consent. The bill also empowers the Attorney General to enforce compliance and address violations as deceptive trade practices.
Date | Action |
---|---|
2025-05-01 | Re-referred to Committee in Second House |
2025-04-22 | Pending Committee Action in Second House |
2025-04-22 | Read for the first time and referred to the House Committee on Children and Senior Advocacy |
2025-04-17 | Engrossed |
2025-04-17 | Chambliss motion to Adopt - Adopted Roll Call 748 |
2025-04-17 | Children and Youth Health 1st Amendment Offered |
2025-04-17 | Third Reading in House of Origin |
2025-04-17 | Petition to Cease Debate - Adopted Roll Call 747 |
Why Relevant: The bill requires app store providers to implement robust age verification for account creation and to ensure minors cannot access apps without parental consent. These are strong age verification and app-store gatekeeping provisions.
Mechanism of Influence: App store providers must 'request and verify the age of individuals creating accounts' and 'link the minor’s account to a parent account and obtain verifiable parental consent before allowing app downloads or purchases.' (Account Creation Requirements; Minor Account Requirements).
Evidence:
Ambiguity Notes: The bill does not specify technical standards for 'reliable methods' of age verification, which could range from self-declaration to biometric or ID checks.
Why Relevant: The bill mandates that app stores provide developers with real-time access to user age category data and consent status, enabling developers to comply with minor-specific requirements.
Mechanism of Influence: Developers must use app store-supplied age/consent status to verify user age and limit app functionality for minors without parental consent.
Evidence:
Ambiguity Notes: The scope of developer obligations is contingent on the accuracy and reliability of app store age/consent data.
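Neither Alabama bill specifies an API shape, so the sketch below is only a guess at what "real-time access to age category data and parental consent status" could look like from the developer's side; every name and category boundary is hypothetical.

```python
from dataclasses import dataclass
from enum import Enum

class AgeCategory(Enum):
    UNDER_13 = "under_13"
    TEEN = "13_to_17"
    ADULT = "18_plus"

@dataclass
class StoreAgeSignal:
    """Hypothetical payload an app store might return for the current user."""
    category: AgeCategory
    parental_consent: bool

# Illustrative policy: some features stay off for minors regardless of consent.
RESTRICTED_FOR_MINORS = {"direct_messages_from_strangers", "in_app_purchases"}

def feature_allowed(signal: StoreAgeSignal, feature: str) -> bool:
    # Adults pass; minors need parental consent, then still hit the
    # feature-level restrictions above.
    if signal.category is AgeCategory.ADULT:
        return True
    if not signal.parental_consent:
        return False
    return feature not in RESTRICTED_FOR_MINORS
```

Note how this makes the Ambiguity Notes concrete: the developer's compliance is only as good as the accuracy of the store-supplied `StoreAgeSignal`.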
Why Relevant: Empowers the Attorney General to enforce the act and set rules for verifying minor accounts, adding a compliance infrastructure component.
Mechanism of Influence: The Attorney General may issue rules regarding verification and can bring enforcement actions as deceptive trade practices.
Evidence:
Ambiguity Notes: The rulemaking authority could be used to specify technical details or expand compliance burdens, but this is not explicit in the bill.
Why Relevant: The bill imposes data protection requirements for age verification processes, including encryption and data minimization.
Mechanism of Influence: App stores must 'limit data collection for age verification and use encryption for data transmission.'
Evidence:
Ambiguity Notes: No specifics on encryption standards or retention periods are provided.
House Bill No. 46, known as the App Store Accountability Act, mandates that app store providers verify the age of users and obtain verifiable parental consent before allowing minors to download or purchase apps. It outlines the responsibilities of both app store providers and developers in protecting minors and ensuring transparency regarding app content ratings. The bill also establishes enforcement mechanisms, including penalties for non-compliance and the creation of an advisory committee to improve app age rating transparency.
Date | Action |
---|---|
2025-01-22 | (H) PREFILE RELEASED 1/17/25 |
Why Relevant: The bill mandates app store providers to verify age and obtain parental consent for minors, directly referencing 'age verification' and 'parental consent.'
Mechanism of Influence: App stores must implement systems to reliably determine user age and secure parental approval before allowing minors to access apps, potentially requiring technical infrastructure for age checks.
Evidence:
Ambiguity Notes: The bill does not specify the technical method for age verification, leaving room for various interpretations (e.g., self-attestation vs. robust identity checks).
Why Relevant: The Act imposes duties on app store providers and developers to display age ratings and content descriptions, and to use APIs for age/consent checks.
Mechanism of Influence: Developers must interact with app store APIs to enforce age gating and consent, integrating compliance into the app distribution process.
Evidence:
Ambiguity Notes: No explicit mention of risk assessments, mitigation plans, or compelled detection/scanning.
Legislation ID: 157547
Bill URL: View Bill
HB2112 amends Title 18 of the Arizona Revised Statutes by adding a new chapter focused on government information technology use, specifically addressing the issue of internet pornography and minors. The bill mandates that commercial entities verify the age of individuals accessing sexual material that may be harmful to minors, prohibits the retention of identifying information, and outlines penalties for violations, thereby aiming to protect minors from inappropriate content online.
Date | Action |
---|---|
2025-05-07 | House Maj Caucus Date2 |
2025-05-07 | House Min Caucus Date2 |
2025-03-18 | Senate Min Caucus Date |
2025-03-18 | Senate Consent Object Date |
2025-03-18 | Senate Maj Caucus Date |
2025-03-17 | Senate Consent Calendar Date |
2025-02-26 | Senate2nd Read |
2025-02-25 | Senate1st Read |
Why Relevant: The bill mandates age verification for access to certain online content, one of the core elements of the EU Chat Act.
Mechanism of Influence: Commercial entities must verify user age using government-issued ID or commercially reasonable methods before granting access to sexual material harmful to minors. This directly imposes a duty on online platforms to gate access based on age.
Evidence:
Ambiguity Notes: The bill does not specify technical standards for 'commercially reasonable' verification, leaving room for interpretation and potentially variable enforcement. It does not require scanning, risk assessments, or moderation changes beyond age gating.
Legislation ID: 158737
Bill URL: View Bill
SB1341 introduces a new chapter to Title 18 of the Arizona Revised Statutes, focusing on the responsibilities of commercial entities that publish or distribute material harmful to minors. It mandates reasonable age verification methods to prevent minors from accessing such content and outlines the civil liabilities for failure to comply, including damages for minors who access harmful material. The bill also clarifies the definitions of harmful material and the scope of commercial entities' responsibilities.
Date | Action |
---|---|
2025-02-03 | Senate2nd Read |
2025-01-30 | Senate1st Read |
Why Relevant: The bill mandates age verification for access to certain online content, a core element of the EU Chat Act's approach to preventing child harm online.
Mechanism of Influence: Requires commercial entities to verify the age of all users seeking access to material harmful to minors, imposing a legal duty to gate access based on age.
Evidence:
Ambiguity Notes: The bill does not specify technical standards for age verification, leaving 'reasonable age verification method' open to interpretation. It does not mandate specific technologies (e.g., biometric, ID upload), nor does it reference proportionality, error rates, or 'least intrusive' means.
Legislation ID: 35540
Bill URL: View Bill
The Arkansas Kids Online Safety Act aims to create a framework for online platforms to safeguard minors from potential harms associated with internet usage. This includes provisions for parental controls, limitations on advertising, and measures to prevent compulsive usage and mental health issues. The act outlines the responsibilities of covered platforms in protecting minors and providing tools for parents to manage their children's online experiences.
Date | Action |
---|---|
2025-03-11 | WITHDRAWN BY AUTHOR |
2025-02-27 | Placed on second reading for the purpose of amendment |
2025-02-27 | REPORTED CORRECTLY ENGROSSED |
2025-02-27 | Amendment No |
2025-02-26 | Placed on second reading for the purpose of amendment |
2025-02-26 | REPORTED CORRECTLY ENGROSSED |
2025-02-26 | Amendment No |
2025-02-12 | Placed on second reading for the purpose of amendment |
Why Relevant: The bill requires platforms to take 'reasonable measures' to prevent harm to minors and avoid 'heightened risks' (see Duty of care — Prevention of harm to minors).
Mechanism of Influence: Mandates a general duty of care, default safety tools for known minors, and risk mitigation through platform design and controls.
Evidence:
Ambiguity Notes: The duty of care is broad and does not specify risk assessment documentation or ongoing mitigation planning as in the EU Chat Act.
Legislation ID: 80573
Bill URL: View Bill
This bill introduces the Arkansas Kids Online Safety Act, which mandates that online platforms implement specific safeguards to protect minors from various online risks, including mental health issues, compulsive usage, and exposure to harmful content. It outlines definitions related to minors and online platforms, establishes a duty of care for covered platforms, and specifies parental tools and reporting mechanisms to enhance the safety of minors online.
Date | Action |
---|---|
2025-05-05 | Died in Senate Committee at Sine Die adjournment |
2025-03-17 | Received from the House |
2025-03-17 | Read the third time and passed and ordered transmitted to the Senate |
2025-03-17 | Read first time |
2025-03-13 | Placed on second reading for the purpose of amendment |
2025-03-13 | Amendment No |
2025-03-13 | REPORTED CORRECTLY ENGROSSED |
2025-03-12 | Returned by the Committee with the recommendation that it do pass as amended 1 |
Why Relevant: The bill imposes a platform duty of care to prevent harm to minors, which is a core theme of the EU Chat Act. It also mandates parental controls and reporting mechanisms.
Mechanism of Influence: Platforms must implement features and policies designed to mitigate risks to minors, such as mental health harms and compulsive use, and must provide tools for parental oversight and reporting harms.
Evidence:
Ambiguity Notes: 'Reasonable measures' and 'safeguards' are broad and subject to interpretation. The bill does not specify technical means or require risk assessment documentation, detection technologies, or age verification systems.
Legislation ID: 109916
Bill URL: View Bill
The Child Content Creation Protection Act is designed to regulate the involvement of minors in content creation on social media platforms. It sets forth definitions, requirements for compensation, record-keeping, privacy rights, and the prohibition of the sexualization of minors in content. The act also mandates social media platforms to provide mechanisms for privacy removal requests and to implement strategies to mitigate risks associated with the exploitation of minors in content creation.
Date | Action |
---|---|
2025-04-22 | Notification that HB1975 is now Act 982 |
2025-04-16 | Correctly enrolled and ordered transmitted to the Governor |
2025-04-16 | TO BE ENROLLED |
2025-04-16 | Returned to the House as passed |
2025-04-16 | Read the third time and passed |
2025-04-16 | Returned from the Senate as passed |
2025-04-14 | Returned by the Committee |
2025-04-10 | Read first time |
Why Relevant: This section requires platforms to develop and regularly reassess risk mitigation strategies for monetized content featuring minors.
Mechanism of Influence: Mandates ongoing risk assessment and documentation by platforms, specifically targeting risks to minors in monetized content.
Evidence:
Ambiguity Notes: The scope of 'risk mitigation strategy' is not fully detailed; it is unclear if this includes technical changes or only policy updates.
Why Relevant: Allows minors or guardians to request removal of content and obligates platforms and creators to comply.
Mechanism of Influence: Creates a takedown/removal mechanism for content featuring minors, with deadlines for compliance.
Evidence:
Ambiguity Notes: Limited to content featuring the requesting minor; not a general removal/blocking order for illegal content.
This bill amends the Social Media Safety Act to introduce new definitions, requirements, and penalties aimed at protecting minors from harmful online content and addictive behaviors. It establishes guidelines for social media companies on how to manage user accounts, particularly for minors, and mandates the implementation of technological measures to prevent circumvention of age verification protocols. The bill also creates the Crimes Against Children Fund to support initiatives aimed at child protection.
Date | Action |
---|---|
2025-04-22 | Notification that SB611 is now Act 900 |
2025-04-16 | Reported correctly enrolled and ordered delivered to the Governor |
2025-04-16 | DELIVERED TO GOVERNOR |
2025-04-15 | Rules suspended |
2025-04-15 | Placed on second reading for purpose of amendment |
2025-04-15 | Read the third time and passed |
2025-04-15 | House Amendment No |
2025-04-15 | ORDERED ENROLLED |
Why Relevant: The bill imposes a duty on social media platforms to implement 'reasonable age verification' and obtain parental consent for minors, and to prevent circumvention of these measures.
Mechanism of Influence: Platforms must reliably identify child users and block access to minors without parental consent, using age verification protocols and anti-circumvention measures.
Evidence:
Ambiguity Notes: The bill does not specify technical standards for age verification (e.g., biometric checks, document upload), leaving implementation details open.
Why Relevant: Requires platforms to 'conduct quarterly audits to assess software impact on minors' behavior' and to implement addiction-mitigation features.
Mechanism of Influence: Mandates ongoing risk assessment and (implicitly) mitigation regarding minors' compulsive use, with reporting and parental dashboard requirements.
Evidence:
Ambiguity Notes: Focus is on addiction/compulsion rather than CSAM, but the audit/mitigation structure mirrors risk assessment obligations.
Why Relevant: Enables enforcement actions and civil penalties for non-compliance, including daily penalties per violation.
Mechanism of Influence: Provides strong incentives for platforms to comply with age verification and parental consent requirements.
Evidence:
Ambiguity Notes: No explicit mention of reporting obligations to a central authority or data preservation.
Legislation ID: 25064
Bill URL: View Bill
The Leading Ethical AI Development (LEAD) for Kids Act requires developers of generative AI systems with significant user bases to provide AI detection tools for users. It establishes the LEAD for Kids Standards Board to regulate AI products aimed at children, requiring risk assessments and compliance with safety standards. The board will classify AI products based on their risk levels and enforce penalties for violations, creating a fund for administering the act.
Date | Action |
---|---|
2025-08-18 | In committee: Referred to suspense file. |
2025-07-17 | Read second time and amended. Re-referred to Com. on APPR. |
2025-07-16 | From committee: Amend, and do pass as amended and re-refer to Com. on APPR. (Ayes 11. Noes 1.) (July 15). |
2025-06-11 | Referred to Com. on JUD. |
2025-06-03 | In Senate. Read first time. To Com. on RLS. for assignment. |
2025-06-02 | Read third time. Passed. Ordered to the Senate. (Ayes 59. Noes 12.) |
2025-05-27 | Read second time. Ordered to third reading. |
2025-05-23 | Read second time and amended. Ordered returned to second reading. |
Why Relevant: The Act requires developers to conduct and submit risk assessments for AI products, particularly those classified as high-risk, and to implement mitigation procedures.
Mechanism of Influence: Developers must register products and submit risk assessments by a set date; high-risk products require pre- and post-deployment assessments and independent audits. The board will adopt risk classification and mitigation criteria.
Evidence:
Ambiguity Notes: The term 'risk assessment' is used, but there are no detailed statutory definitions of what constitutes an adequate assessment or mitigation plan. The board is tasked with further defining these requirements via regulation.
Why Relevant: The Act mandates deployers to implement procedures to prevent children from accessing prohibited/high-risk AI products.
Mechanism of Influence: Deployers must ensure children cannot access certain products, and must revoke licenses if misuse is detected.
Evidence:
Ambiguity Notes: The nature of the procedures (e.g., age verification, technical blocks) is not specified in the bill text; details are likely to be defined by board regulation.
Why Relevant: The Act requires independent third-party audits for covered products based on risk level.
Mechanism of Influence: Developers must submit products and documentation for audit, allowing for oversight and compliance checks.
Evidence:
Ambiguity Notes: Audit scope and standards are not described in the bill text.
Why Relevant: The Act creates a regulatory board with authority to adopt regulations, classify risk, and enforce compliance.
Mechanism of Influence: The board sets standards, oversees compliance, and refers violations for enforcement.
Evidence:
Ambiguity Notes: Board powers are broad but not explicitly tied to content scanning, detection orders, or encryption.
Why Relevant: The Act requires incident reporting and public display of compliance-related information.
Mechanism of Influence: Deployers must report incidents and display license requirements, increasing transparency.
Evidence:
Ambiguity Notes: No detail on the scope or content of required incident reports.
Legislation ID: 25832
Bill URL: View Bill
Senate Bill No. 468 introduces regulations for high-risk artificial intelligence systems that process personal information, mandating covered deployers to implement comprehensive information security programs. The bill outlines specific requirements for safeguarding personal information and establishes violations as deceptive trade acts under existing law. It empowers the California Privacy Protection Agency to regulate and enforce these provisions, furthering the goals of the California Privacy Rights Act of 2020.
Date | Action |
---|---|
2025-05-23 | May 23 hearing: Held in committee and under submission. |
2025-05-16 | Set for hearing May 23. |
2025-05-05 | May 5 hearing: Placed on APPR. suspense file. |
2025-04-25 | Set for hearing May 5. |
2025-04-23 | From committee: Do pass and re-refer to Com. on APPR. (Ayes 11. Noes 0. Page 835.) (April 22). Re-referred to Com. on APPR. |
2025-04-23 | From committee: Do pass and re-refer to Com. on APPR. (Ayes 11. Noes 0.) (April 22). Re-referred to Com. on APPR. |
2025-03-25 | Set for hearing April 22. |
2025-02-26 | Referred to Com. on JUD. |
Why Relevant: The bill requires ongoing risk assessments and mitigation measures for high-risk AI systems that handle personal information, which is conceptually similar to the risk assessment and mitigation requirements in the EU Chat Act.
Mechanism of Influence: Covered deployers must regularly assess risks and improve safeguards for personal information, which could include changes to how AI systems process or secure data, but the bill does not specify platform moderation, recommender systems, or user-facing features.
Evidence:
Ambiguity Notes: The bill uses broad language regarding 'risk assessment' and 'comprehensive information security program,' but does not explicitly define scope or methods for assessment, nor does it mention user-generated content or online communication services.
Why Relevant: The bill requires data security and internal control measures, including employee designation, training, and disciplinary measures, which are adjacent to but not identical to the data preservation and internal controls seen in the EU Chat Act.
Mechanism of Influence: Mandates internal oversight and employee training for security, but does not specify preservation of user content/metadata tied to detection or complaints.
Evidence:
Ambiguity Notes: Focus is on internal business practices for security, not on investigatory preservation or user redress mechanisms.
Legislation ID: 34232
Bill URL: View Bill
This bill requires social media companies to implement specific tools and settings designed to protect minor users in Colorado. It includes requirements for age assurance systems, user control settings, parental support tools, and measures to enhance privacy and security for minors. The bill also prohibits practices that undermine user autonomy and defines certain engagement features as high-risk for minors, subjecting them to additional scrutiny. Furthermore, it empowers the attorney general to create rules for enforcement.
Date | Action |
---|---|
2025-05-13 | House Committee on Appropriations Lay Over Unamended - Amendment(s) Failed |
2025-04-02 | House Committee on Health & Human Services Refer Amended to Appropriations |
2025-02-26 | Introduced In House - Assigned to Health & Human Services |
Why Relevant: The bill explicitly requires social media companies to implement an 'age assurance system to identify minor users accurately.'
Mechanism of Influence: This creates a binding duty for platforms to assess or verify the age of users, likely requiring technical means such as age estimation or verification for all or most users.
Evidence:
Ambiguity Notes: The term 'age assurance system' is not further defined in the abstract, so the technical intrusiveness (e.g., ID upload, biometric analysis) is unclear.
Why Relevant: The bill mandates 'tools for minors to control their social media experience' and 'tools for parents to assist minor users,' with minimum capabilities specified.
Mechanism of Influence: This imposes a duty on platforms to build and deploy technical controls, likely affecting product design and user privacy.
Evidence:
Ambiguity Notes: Details on the minimum capabilities and how invasive these tools are (e.g., surveillance, reporting to parents) are not specified in the abstract.
Why Relevant: The bill prohibits designs that 'undermine user autonomy or encourage minors to forgo privacy protections.'
Mechanism of Influence: This creates a legal standard for product design, potentially requiring platforms to audit and alter engagement features, privacy defaults, or nudges.
Evidence:
Ambiguity Notes: What counts as 'undermining autonomy' or 'encouraging forgoing privacy' is open to interpretation and could be defined via rulemaking.
Why Relevant: The Attorney General is empowered to 'create rules for enforcement.'
Mechanism of Influence: This allows for additional, potentially more intrusive requirements (such as reporting or compliance infrastructure) to be added by regulation.
Evidence:
Ambiguity Notes: The scope of rulemaking is not detailed; could be used to impose further platform duties.
Legislation ID: 174787
Bill URL: View Bill
This bill mandates that certain internet platforms, defined as covered platforms, verify the age of users attempting to access material harmful to children starting July 1, 2026. It requires these platforms to employ reasonable age verification measures, prevent access by minors, conduct annual audits, and respect user privacy. The legislation highlights the need for effective protections against the potential negative impacts of early exposure to explicit content on minors' mental health and development.
Date | Action |
---|---|
2025-04-14 | Senate Second Reading Laid Over to 05/08/2025 - No Amendments |
2025-04-10 | Senate Second Reading Laid Over to 04/14/2025 - No Amendments |
2025-04-04 | Senate Second Reading Laid Over to 04/10/2025 - No Amendments |
2025-04-01 | Senate Second Reading Laid Over to 04/04/2025 - No Amendments |
2025-03-27 | Senate Committee on Health & Human Services Refer Amended to Senate Committee of the Whole |
2025-03-20 | Introduced In Senate - Assigned to Health & Human Services |
Why Relevant: The bill explicitly mandates 'reasonable age verification measures' for covered platforms to prevent minors from accessing harmful material.
Mechanism of Influence: Platforms must implement technical systems to verify user age, going beyond simple IP checks, and must block access to minors. They must also conduct annual independent audits and provide an appeals process.
Evidence:
Ambiguity Notes: The bill does not specify exact technologies or methods, only that they must be 'reasonable' and use available technology. 'Harmful material' is not defined in the provided summary.
Why Relevant: The bill requires destruction of personal data used for age verification, addressing privacy concerns.
Mechanism of Influence: Mandates that personal data collected for age verification must be destroyed as soon as reasonably possible, and that platforms comply with the Colorado Privacy Act.
Evidence:
Ambiguity Notes: No detail on enforcement or penalties for failure to destroy data; 'as soon as reasonably possible' is not defined.
Legislation ID: 78092
Bill URL: View Bill
This bill mandates that owners of social media platforms create an online safety center and develop a cyberbullying policy by January 1, 2026. It outlines the necessary resources for users to prevent cyberbullying and access mental health services. Additionally, it redefines heightened risk of harm to minors to include risks to their physical and mental health. It also sets requirements for online services directed at minors, including consent for data processing and safeguards against unsolicited communications.
Date | Action |
---|---|
2025-03-24 | Favorable Report, Tabled for the Calendar, House |
2025-03-24 | House Calendar Number 137 |
2025-03-24 | Reported Out of Legislative Commissioners Office |
2025-03-17 | Referred to Office of Legislative Research and Office of Fiscal Analysis 03/24/25 12:00 PM |
2025-03-10 | Filed with Legislative Commissioners Office |
2025-03-06 | Joint Favorable |
2025-02-25 | Referred to Joint Committee on Children |
2025-02-24 | Drafted by Committee |
Why Relevant: The bill requires controllers (platforms) to conduct data protection assessments and establish mitigation plans if their services pose heightened risks to minors.
Mechanism of Influence: Platforms must formally assess risks to minors and create plans to mitigate them, which is analogous to the EU Chat Act's risk assessment and mitigation requirements.
Evidence:
Ambiguity Notes: The scope of 'risk' is limited to data protection and mental health, and does not explicitly extend to child sexual abuse or solicitation.
Legislation ID: 78476
Bill URL: View Bill
The bill establishes guidelines for social media operators to prevent minors from accessing inappropriate content and to ensure that any recommendations made to minors are done with parental consent. It defines key terms related to minors and social media, outlines the responsibilities of covered operators, and mandates reporting on user data. The bill seeks to enhance the safety of minors online and to protect their privacy.
Date | Action |
---|---|
2025-05-16 | Senate Calendar Number 506 |
2025-05-16 | Favorable Report, Tabled for the Calendar, Senate |
2025-05-14 | House Passed as Amended by House Amendment Schedule A |
2025-05-14 | House Adopted House Amendment Schedule A 8247 |
2025-03-31 | Reported Out of Legislative Commissioners Office |
2025-03-31 | File Number 348 |
2025-03-31 | Favorable Report, Tabled for the Calendar, House |
2025-03-31 | House Calendar Number 230 |
Why Relevant: The bill explicitly requires 'platform operators to verify the age of users before granting access to personalized content,' and to obtain 'parental consent for minors.'
Mechanism of Influence: This introduces a statutory duty for age verification and parental consent, directly targeting under-18 users' access to algorithmic content.
Evidence:
Ambiguity Notes: The bill requires 'commercially reasonable methods' for age verification, which could range from self-declaration to government ID, depending on interpretation.
Why Relevant: Data retention rules are present: 'Information collected for age verification must be deleted immediately after use.'
Mechanism of Influence: While this limits data retention, it also creates a compliance obligation to securely handle and promptly delete sensitive user data.
Evidence:
Ambiguity Notes: Immediate deletion is specified, but enforcement and audit mechanisms are not detailed.
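"Deleted immediately after use" maps naturally onto a scoped-lifetime pattern. The sketch below (standard-library Python, illustrative only, with a hypothetical `run_age_check`) keeps the raw verification data alive for exactly one block and zeroes the buffer on exit; true deletion in practice also means purging copies from logs, caches, and vendor ticketing systems, which is exactly where the Discord/5CA incident went wrong.

```python
import contextlib

@contextlib.contextmanager
def age_verification_data(raw_upload: bytes):
    """Hold verification data for one scoped use, then zero the buffer."""
    buf = bytearray(raw_upload)
    try:
        yield buf
    finally:
        for i in range(len(buf)):  # best-effort in-process zeroization
            buf[i] = 0

# Usage: the document never outlives the verification step.
# with age_verification_data(upload) as doc:
#     result = run_age_check(doc)  # hypothetical checker
```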
This bill proposes amendments to the general statutes that require social media providers operating in the state to enforce age restrictions, implement content filters for younger users, and accept liability for damages resulting from cyberbullying incidents on their platforms.
Date | Action |
---|---|
2025-01-08 | Referred to Joint Committee on General Law |
Why Relevant: The bill mandates that social media providers implement content filtering technologies specifically for younger users to prevent cyberbullying.
Mechanism of Influence: This creates a duty for providers to develop and deploy technological solutions (filters) targeting certain content for a subset of users (minors), which is a form of risk mitigation, though limited to cyberbullying.
Evidence:
Ambiguity Notes: The scope is limited to cyberbullying and 'younger users'; it does not explicitly require ongoing risk assessments, broader mitigation plans, or cover CSAM/solicitation risks.
Why Relevant: The bill requires social media providers to enforce minimum age requirements and verify user age.
Mechanism of Influence: This is an explicit age verification mandate, requiring providers to reliably identify users' ages as a condition of access.
Evidence:
Ambiguity Notes: The bill does not specify the technological means or reliability standards for age verification, nor does it tie verification to broader risk mitigation or app-store duties.
This bill introduces the Protect Our Children Act, which mandates that manufacturers of tablets and smartphones enable filters to prevent minors from accessing harmful material upon activation of the device in Florida. It establishes liability for manufacturers and individuals who circumvent these filters, and increases penalties for adults who lure children. The bill also defines terms related to harmful content and outlines enforcement measures by the Attorney General.
Date | Action |
---|---|
2025-06-16 | H Died in Industries & Professional Activities Subcommittee |
2025-05-03 | H Indefinitely postponed and withdrawn from consideration |
2025-03-05 | H Referred to Criminal Justice Subcommittee |
2025-03-05 | H Now in Industries & Professional Activities Subcommittee |
2025-03-05 | H Referred to Commerce Committee |
2025-03-05 | H Referred to Industries & Professional Activities Subcommittee |
2025-03-04 | H 1st Reading |
2025-02-28 | H Filed |
Why Relevant: Mandates technology-based filtering at the device level for all new smartphones and tablets activated in Florida, with enforcement and liability provisions.
Mechanism of Influence: Manufacturers must pre-install and auto-enable filters that block access to harmful material for minors, ensuring that devices are 'child-proofed' by default.
Evidence:
Ambiguity Notes: The scope of 'harmful material' is defined, but the technical standards for the filters and their effectiveness are not detailed. No explicit mention of scanning, risk assessments, or ongoing monitoring.
Why Relevant: Imposes liability on individuals who circumvent or disable filters on minors' devices, with exceptions for parents/guardians.
Mechanism of Influence: Creates a deterrent against removing or bypassing the required filters, extending enforcement beyond manufacturers.
Evidence:
Ambiguity Notes: No provisions for ongoing monitoring or reporting of circumvention; enforcement is reactive and based on harm.
Why Relevant: Attorney General is empowered to enforce compliance and seek penalties.
Mechanism of Influence: Centralized state enforcement authority with powers to investigate, enjoin, and penalize non-compliance.
Evidence:
Ambiguity Notes: No mention of reporting obligations or centralized routing of complaints/data.
CS/HB 743 addresses the use of social media by minors in Florida. It requires social media platforms to implement specific measures to protect minors under the age of 16, including allowing parents or guardians to view their children's messages, terminating accounts of minors without parental consent, and prohibiting access to disappearing or self-destructing messages. The bill aims to enhance the safety of minors on social media platforms.
Date | Action |
---|---|
2025-06-16 | H Died on Second Reading Calendar |
2025-05-03 | H Indefinitely postponed and withdrawn from consideration |
2025-04-24 | H Bill referred to House Calendar |
2025-04-24 | H Added to Second Reading Calendar |
2025-04-23 | H Laid on Table under Rule 7.18(a) |
2025-04-23 | H 1st Reading |
2025-04-23 | H Reported out of Commerce Committee |
2025-04-23 | H CS Filed |
Why Relevant: The bill imposes specific duties on social media platforms regarding account management, data deletion, and parental controls for minors, which are platform obligations.
Mechanism of Influence: Platforms must implement workflows to identify users under 14, allow parents/guardians to terminate accounts, and prevent access to self-destructing messages. They must also ensure data deletion and provide certain information to parents and law enforcement.
Evidence:
Ambiguity Notes: The bill does not specify technical means for age verification or ongoing risk assessment, nor does it mandate scanning, reporting, or broader compliance infrastructure.
Legislation ID: 214951
Bill URL: View Bill
HB 931 establishes specific responsibilities for developers and manufacturers regarding applications and devices likely accessed by children. It mandates age verification processes, parental consent for minors, and requires application stores to implement nondiscriminatory practices. The bill also outlines enforcement mechanisms and specifies limitations on liability for manufacturers who comply with the regulations.
Date | Action |
---|---|
2025-06-16 | H Died in Industries & Professional Activities Subcommittee |
2025-05-03 | H Indefinitely postponed and withdrawn from consideration |
2025-03-04 | H 1st Reading |
2025-03-02 | H Referred to Industries & Professional Activities Subcommittee |
2025-03-02 | H Referred to Commerce Committee |
2025-03-02 | H Now in Industries & Professional Activities Subcommittee |
2025-03-02 | H Referred to Judiciary Committee |
2025-02-24 | H Filed |
Why Relevant: The bill establishes direct obligations for app developers and manufacturers to implement age estimation and verification, and for app stores to gate downloads by minors, closely mirroring the EU Chat Act's age-verification and app-store gatekeeping features.
Mechanism of Influence: Manufacturers must estimate the age of the primary user at device activation, and app stores must obtain parental consent for minors before allowing downloads ("Manufacturers must estimate the age of the primary user at device activation" and "Application stores must obtain parental consent for known users under 16 before allowing downloads"). This creates a system of age gating and parental oversight for children's access to apps.
Evidence:
Ambiguity Notes: The bill does not specify the technical means of age estimation or verification, nor does it mandate specific technologies or standards for reliability. The scope of 'likely to be accessed by children' could be subject to interpretation.
Why Relevant: The bill requires developers to notify app stores if their applications are likely accessed by children and to provide parental controls, echoing risk identification and mitigation planning.
Mechanism of Influence: Developers are responsible for risk assessment (identifying if their app is likely to be accessed by children) and for implementing mitigation measures (parental controls).
Evidence:
Ambiguity Notes: No explicit requirement for formal written risk assessments or ongoing risk mitigation plans as in the EU Chat Act, but the duties are functionally similar.
Legislation ID: 191432
Bill URL: View Bill
This bill establishes requirements for developers and manufacturers regarding the distribution of applications likely to be accessed by children. It mandates that developers assess whether their applications are likely to be accessed by minors and implement parental controls. Manufacturers must also take steps to verify the age of users and ensure that applications in their stores comply with these regulations. The bill outlines enforcement mechanisms, including actions by the Attorney General, and provides a framework for age verification methods.
Date | Action |
---|---|
2025-06-16 | • Died in Commerce and Tourism |
2025-05-03 | • Indefinitely postponed and withdrawn from consideration |
2025-03-20 | • On Committee agenda-- Commerce and Tourism, 03/25/25, 8:30 am, 110 Senate Building --Temporarily Postponed |
2025-03-12 | • On Committee agenda-- Commerce and Tourism, 03/17/25, 1:30 pm, 110 Senate Building --Temporarily Postponed |
2025-03-10 | • Introduced |
2025-03-06 | • Referred to Commerce and Tourism; Judiciary; Rules |
2025-02-26 | • Filed |
Why Relevant: The bill requires developers to assess whether their applications are likely to be accessed by children and to notify application stores, which is a form of risk assessment.
Mechanism of Influence: Developers must proactively determine the likelihood of child access and notify app stores, which could trigger further obligations or controls.
Evidence:
Ambiguity Notes: The risk assessment is limited to determining child accessibility, not broader risks like CSAM or grooming.
Why Relevant: The bill mandates that developers implement parental control features for child users.
Mechanism of Influence: Developers are required to build and maintain parental control systems for apps accessed by children.
Evidence:
Ambiguity Notes: No explicit detail about the nature or scope of parental controls (e.g., content filtering, usage limits).
Why Relevant: Manufacturers must estimate the user’s age at device activation and provide age signals to developers; app stores must obtain parental consent for downloads by users under 16.
Mechanism of Influence: This establishes an age verification/assessment regime for both device activation and app downloads.
Evidence:
Ambiguity Notes: It is not specified how robust or reliable the age estimation and verification must be.
Why Relevant: The bill imposes obligations on application stores to ensure compliance with age verification.
Mechanism of Influence: Application stores act as gatekeepers, controlling which apps can be accessed by minors based on developer notifications and parental consent.
Evidence:
Ambiguity Notes: No explicit reference to risk assessment by app stores or blocking/removal obligations.
This bill, titled the "Protect Our Children Act," mandates that starting January 1, 2026, all newly manufactured tablets and smartphones in Florida come equipped with a filter that prevents minors from accessing material deemed harmful. The bill outlines the responsibilities of manufacturers, establishes civil and criminal liabilities for violations, and allows parents or guardians to take legal action against violators. Additionally, it increases penalties for adults who lure or entice minors for unlawful purposes and revises definitions related to sexual offenses.
Date | Action |
---|---|
2025-06-16 | • Died in Criminal Justice |
2025-05-03 | • Indefinitely postponed and withdrawn from consideration |
2025-03-10 | • Introduced |
2025-03-07 | • Referred to Criminal Justice; Appropriations Committee on Criminal and Civil Justice; Fiscal Policy |
2025-02-28 | • Filed |
Why Relevant: The bill imposes a binding duty on device manufacturers to implement a technical filter that blocks harmful content for minors, which is a form of age-gating and content restriction.
Mechanism of Influence: Manufacturers must ensure every device sold or activated in Florida comes with a filter that prevents minors from accessing harmful content, and this filter must be enabled by default and difficult to remove.
Evidence:
Ambiguity Notes: The bill does not specify the technical details of the filter, how 'harmful material' is defined, nor does it require ongoing risk assessments or changes to moderation/recommender systems.
Why Relevant: The law creates liability for both manufacturers and individuals who enable minors to bypass the filter, and allows for enforcement by the Attorney General and private lawsuits.
Mechanism of Influence: Enforcement mechanisms (civil/criminal) ensure compliance and create strong incentives for manufacturers to implement robust filtering.
Evidence:
Ambiguity Notes: While strong on enforcement, the bill does not mention detection/scanning technologies, reporting to authorities, or requirements for data preservation.
Why Relevant: The bill mandates age-gating (filtering based on minor status) at the device level, which is a core element of the EU Chat Act's approach to child protection online.
Mechanism of Influence: By requiring a filter that blocks minors from accessing certain content, the bill effectively mandates age verification or assessment at the device level.
Evidence:
Ambiguity Notes: No explicit requirement for verifying the age of the user, but the obligation to filter for minors suggests some age assessment or declaration during device setup.
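The filter duty is simple to state in code, which is part of why legislatures find it attractive; the hard parts (classifying 'harmful' content, surviving factory resets, authenticating the parent) all live outside this sketch. A minimal sketch, assuming a single enabled flag and a parent-only disable path, neither of which the bill spells out:

```python
from dataclasses import dataclass

@dataclass
class DeviceFilter:
    enabled: bool = True  # ships enabled by default on every new device

    def disable(self, requester_is_parent_or_guardian: bool) -> bool:
        """'Difficult to remove' in practice: only a parent or guardian
        may turn the filter off. Returns the resulting state."""
        if requester_is_parent_or_guardian:
            self.enabled = False
        return self.enabled
```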
This bill amends existing laws regarding social media use by minors, specifically focusing on account management for users under the age of 16. It mandates that social media platforms terminate accounts for users under 14, allow parents to view messages, and restrict the use of disappearing messages. Additionally, it requires platforms to provide a way to decrypt messages for law enforcement when presented with a warrant.
Date | Action |
---|---|
2025-06-16 | • Died in Messages |
2025-05-03 | • Indefinitely postponed and withdrawn from consideration |
2025-04-24 | • In Messages |
2025-04-24 | • Read 2nd time -SJ 552 • Amendment(s) adopted (736458) -SJ 552 • Read 3rd time -SJ 553 • CS passed as amended; YEAS 34 NAYS 3 -SJ 553 |
2025-04-21 | • Placed on Special Order Calendar, 04/24/25 |
2025-04-16 | • Favorable by- Rules; YEAS 21 NAYS 0 • Placed on Calendar, on 2nd reading |
2025-04-11 | • On Committee agenda-- Rules, 04/16/25, 8:30 am, 412 Knott Building |
2025-04-02 | • Now in Rules |
Why Relevant: Requires social media platforms to provide 'decryption mechanisms for law enforcement with a warrant.'
Mechanism of Influence: Mandates that platforms be technically capable of decrypting user messages on demand for law enforcement, which is incompatible with end-to-end encryption and would in practice require key escrow, client-side access, or an equivalent backdoor (see the escrow sketch below).
Evidence:
Ambiguity Notes: Does not specify technical implementation, but the requirement to provide decryption on demand is a strong signal of compelled access to private communications.
Why Relevant: Mandates parental access to all messages for minors under 16.
Mechanism of Influence: Requires platforms to allow parents to view all messages, which would necessitate either storing messages in a readable form or providing a mechanism to bypass encryption for parental review.
Evidence:
Ambiguity Notes: Not explicit about technical means, but implies no strong encryption or at least a parental-access backdoor.
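Both duties above (warrant decryption and parental message review) have the same architectural consequence: someone other than the two endpoints must hold a usable key. A minimal escrow sketch, assuming the `cryptography` package and a per-conversation symmetric key; nothing here comes from the bill text, which names no mechanism.

```python
from cryptography.fernet import Fernet

# The mandate's effect in miniature: the platform retains a key copy.
# Under genuine end-to-end encryption, only the endpoints would hold it.
key_escrow: dict[str, bytes] = {}

def start_conversation(conversation_id: str) -> Fernet:
    key = Fernet.generate_key()
    key_escrow[conversation_id] = key  # escrowed for warrants/parents
    return Fernet(key)

def decrypt_for_review(conversation_id: str, ciphertext: bytes) -> bytes:
    """Warrant response or parental review: possible only because the
    key was escrowed, i.e., the scheme was never end-to-end."""
    return Fernet(key_escrow[conversation_id]).decrypt(ciphertext)
```

Any design that can answer a warrant this way can also be breached or subpoenaed this way, which is the crux of the encryption objection.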
Why Relevant: Prohibits disappearing or self-destructing messages for minors.
Mechanism of Influence: Requires platforms to disable or restrict features that automatically delete communications, which is a mitigation measure targeting risk of child exploitation.
Evidence:
Ambiguity Notes: Does not require scanning, but functionally limits privacy features.
Why Relevant: Mandates age-based account management and parental consent for minors.
Mechanism of Influence: Requires platforms to identify user ages and enforce account termination or parental controls, which necessitates some form of age verification or assessment.
Evidence:
Ambiguity Notes: Does not specify the verification mechanism, but duties are explicit.
Legislation ID: 27818
Bill URL: View Bill
This bill amends Chapter 489X of the Hawaii Revised Statutes to include provisions regarding the publishing or distributing of material harmful to minors on the internet. It defines commercial entities and material harmful to minors, and imposes civil penalties for violations of age verification requirements. The bill also outlines the jurisdiction for legal actions and specifies that it does not impose liability on internet service providers.
Date | Action |
---|---|
2025-01-27 | Referred to CPC, JHA, referral sheet 4 |
Why Relevant: The bill explicitly imposes an age verification requirement for access to certain online content.
Mechanism of Influence: Commercial entities must verify user age before granting access to 'material harmful to minors,' using a commercial database or another reasonable method.
Evidence:
Ambiguity Notes: The bill does not specify the technical details of 'another reasonable method,' leaving room for interpretation.
Legislation ID: 27832
Bill URL: View Bill
This bill establishes a new chapter in the Hawaii Revised Statutes focused on internet protections for minors. It mandates that any commercial entity publishing pornographic material on websites accessible in Hawaii must verify the age of users to ensure they are over eighteen. The bill outlines acceptable methods for age verification, specifies exemptions for certain entities, and establishes penalties for non-compliance.
Date | Action |
---|---|
2025-01-27 | Referred to CPC, JHA, referral sheet 4 |
Why Relevant: The bill mandates age verification for access to certain online content, which is one of the core elements associated with the EU Chat Act's approach to online child protection.
Mechanism of Influence: It requires commercial pornographic websites to use 'reasonable age verification methods' such as digital ID or age verification software, and prohibits retention of identifying information.
Evidence:
Ambiguity Notes: The bill's age verification duty is limited to pornographic material and does not extend to general online services, messaging, or social media. There are no requirements for risk assessments, ongoing mitigation, or detection/scanning for CSAM.
Legislation ID: 29647
Bill URL: View Bill
This bill introduces the Hawaii Age Verification for App Developers and App Stores Act aimed at ensuring that app developers and app store providers implement robust age verification systems. It defines age categories for users, mandates parental consent for minors accessing apps, and requires app stores to display age ratings and provide parental controls. The bill seeks to prevent minors from downloading inappropriate content and to ensure developers comply with age verification standards.
Date | Action |
---|---|
2025-01-27 | Referred to CPN/LBT, JDC. |
Why Relevant: The bill imposes explicit age verification and parental consent requirements on both app stores and developers, including mandates to verify user age and restrict minors’ access to apps.
Mechanism of Influence: App stores and developers must implement technical measures to verify user age and obtain parental consent, and restrict minors from accessing apps based on age ratings. This likely involves collecting and verifying user data at the point of app download or account creation.
Evidence:
Ambiguity Notes: The specific technical means of age verification are not detailed; the scope of 'robust' age verification and what constitutes 'verifiable parental consent' may vary based on rules adopted by the Department of Commerce and Consumer Affairs.
Why Relevant: The bill creates obligations for app stores to display age ratings, provide parental controls, and for developers to give content descriptions and ratings.
Mechanism of Influence: This facilitates parental oversight and limits exposure of minors to inappropriate content through technical controls and disclosures.
Evidence:
Ambiguity Notes: The bill does not specify how granular or effective the parental controls must be, nor does it mandate risk assessments or mitigation plans beyond age verification.
Legislation ID: 27194
Bill URL: View Bill
This bill introduces new provisions in the Hawaii Revised Statutes regarding the operation of addictive social media platforms and the handling of personal data of minors. It defines key terms, outlines prohibitions on addictive feeds for users identified as minors, and sets requirements for obtaining parental consent. Additional regulations are established for data processing and privacy protections for minors, as well as penalties for non-compliance.
Date | Action |
---|---|
2025-01-21 | Referred to ECD, CPC/JHA, FIN, referral sheet 2 |
Why Relevant: The bill requires operators to use 'reasonable methods' to determine if a user is a minor and to obtain 'verifiable parental consent' before providing addictive feeds to minors.
Mechanism of Influence: This is a form of age verification/assessment and gating for access to certain features, similar to core elements in the EU Chat Act.
Evidence:
Ambiguity Notes: The bill does not specify technical standards for age verification or parental consent, leaving room for interpretation.
This bill introduces the Children's Device Protection Act, which requires all tablets and smartphones activated in Idaho to have built-in internet filters that block access to obscene materials. It establishes manufacturer liability if minors can access such content due to non-compliance with these filtering requirements. The bill also empowers the Attorney General to enforce compliance and impose penalties for violations.
Date | Action |
---|---|
2025-03-06 | Reported Printed; referred to State Affairs |
2025-03-05 | Introduced; read first time; referred to JR for Printing |
Why Relevant: The bill mandates age assessment ('determine the age of the user during activation') and requires technical mitigation (built-in filters) to prevent minors from accessing certain content.
Mechanism of Influence: Manufacturers must implement age checks and enable content filters by default for minors, which is a form of age verification and technical risk mitigation at the device level.
Evidence:
Ambiguity Notes: The bill does not specify the technical standard for age determination or filtering (e.g., biometric, ID verification, self-declaration), nor does it require ongoing risk assessments or provider-level risk mitigation.
Why Relevant: The bill establishes manufacturer liability for non-compliance, enforced by the state Attorney General.
Mechanism of Influence: Manufacturers face civil penalties if their devices allow minors to access obscene material due to absent or disabled filters, incentivizing compliance.
Evidence:
Ambiguity Notes: Liability is limited by a 'good faith' compliance effort clause; the threshold for this is undefined.
This bill introduces the "Standard Protection for All Resources on Kids Devices Act," which mandates the implementation of filtering software on devices used by minors to prevent access to harmful content. It outlines the responsibilities of manufacturers and app developers, establishes liability for non-compliance, and empowers the Attorney General to enforce these provisions.
Date | Action |
---|---|
2025-03-13 | Reported Printed; referred to State Affairs |
2025-03-12 | Introduced; read first time; referred to JR for Printing |
Why Relevant: The bill mandates risk mitigation via device-level filtering, requiring manufacturers and app developers to prevent minors' access to harmful content and to disclose risks.
Mechanism of Influence: Manufacturers must install and enable filters for minors, and app developers must identify and disclose apps' potential to access harmful content or bypass controls. App stores must gate harmful apps behind password protections.
Evidence:
Ambiguity Notes: The definition of 'harmful content' and the technical standards for filtering are not fully detailed, but the mandate is explicit.
Why Relevant: The bill requires age determination for device setup, a form of age verification/assessment.
Mechanism of Influence: Devices must determine the user's age during setup to enable or disable filters, directly impacting access to content based on age.
Evidence:
Ambiguity Notes: It is not specified how age is determined or what verification methods are acceptable.
Why Relevant: The bill imposes app-store gatekeeping by requiring password gating for harmful apps and disclosures in the app store.
Mechanism of Influence: App stores must prevent downloads of identified harmful apps without a filter password, and must display disclosures about app capabilities.
Evidence:
Ambiguity Notes: The criteria for 'harmful' apps and the process for classification are not fully specified.
Why Relevant: The bill requires notification to parents of download and bypass attempts, a form of reporting/internal controls.
Mechanism of Influence: Filters must notify parents when minors attempt to download or bypass restricted content.
Evidence:
Ambiguity Notes: It is not clear whether these notifications are real-time or what information is included.
Why Relevant: The bill creates liability for manufacturers and developers and empowers the Attorney General to enforce compliance, establishing a compliance infrastructure.
Mechanism of Influence: Liability provisions and enforcement by the Attorney General create incentives for compliance and establish a legal infrastructure for oversight.
Evidence:
Ambiguity Notes: No explicit requirement for a legal representative or in-state point of contact, but enforcement authority is clear.
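The store-side duties above compose into a single gate: flagged apps require the filter password, and both the attempt and any failed bypass are reported to the parent. A sketch under those assumptions; the bill defines neither the password mechanism nor the notification channel, so every name here is invented.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class FilterState:
    parent_password: str  # hypothetical parental credential

def attempt_download(state: FilterState, app_name: str, flagged_harmful: bool,
                     supplied_password: str | None,
                     notify_parent: Callable[[str], None]) -> bool:
    """Password-gate flagged apps and notify the parent of attempts."""
    if not flagged_harmful:
        return True
    notify_parent(f"Download attempt: {app_name} (flagged)")
    if supplied_password == state.parent_password:
        return True
    notify_parent(f"Blocked download of {app_name}: password check failed")
    return False
```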
The Online Age Verification for Material Harmful to Minors Act mandates that any commercial entity distributing material deemed harmful to minors must ensure that users are at least 18 years old. Verification can be conducted through established databases or other reasonable methods. The bill establishes civil liability for entities that fail to comply, while also exempting certain internet service providers from liability.
Date | Action |
---|---|
2025-01-09 | Referred to Rules Committee
2025-01-09 | First Reading
2025-01-03 | Prefiled with Clerk by Rep. Jed Davis
Why Relevant: The bill mandates age verification for access to certain online content, which is a core element (age verification/assessment) seen in the EU Chat Act.
Mechanism of Influence: Commercial entities must implement systems to verify users' ages before granting access to material harmful to minors. This likely means collecting personal information or using third-party verification services, impacting user privacy and anonymity.
Evidence:
Ambiguity Notes: 'Commercially available database or another reasonable method' is broad and could be interpreted to include various intrusive or less-intrusive verification mechanisms. However, the bill does not specify requirements for ongoing risk assessment, mitigation plans, or technology mandates beyond age checks.
The Digital Age Assurance Act mandates that manufacturers of devices take reasonable steps to determine the age of primary users upon device activation. It requires websites and applications that host mature content to implement age verification measures and provide parental controls for minors. The Act empowers the Attorney General to enforce compliance and sets provisions for civil actions against violators, while limiting home rule authority on related regulations.
Date | Action |
---|---|
2025-03-21 | Rule 19(a) / Re-referred to Rules Committee
2025-03-11 | Assigned to Executive Committee
2025-03-06 | Added Co-Sponsor Rep. Maurice A. West, II
2025-02-18 | Referred to Rules Committee
2025-02-18 | First Reading
2025-02-06 | Filed with the Clerk by Rep. Jennifer Gong-Gershowitz
Why Relevant: The Act contains explicit, binding obligations for device manufacturers to determine or estimate user age and for websites to implement age verification and parental controls.
Mechanism of Influence: Manufacturers must assess age at device activation and communicate this to online services. Websites must block under-18 users from mature content and add parental controls, directly affecting user access and privacy.
Evidence:
Ambiguity Notes: The terms 'reasonable steps' and 'determine or estimate the user's age' are broad and could be interpreted to require anything from self-declaration to biometric or ID-based verification. The requirement to provide a 'digital signal' to online services is also undefined in technical detail.
Why Relevant: The Act mandates age verification for access to mature content on websites and apps.
Mechanism of Influence: Websites must block users under 18 if a digital age signal indicates so, which likely requires integration with device-level or third-party age assurance systems.
Evidence:
Ambiguity Notes: It is not specified how robust or privacy-invasive the age verification must be, nor whether it applies to interpersonal communication services or only to mature content.
Why Relevant: The Act mandates parental controls for minors on websites with mature content.
Mechanism of Influence: Websites must implement technical controls to allow parents to restrict or monitor access, which may require collecting additional information about users and their guardians.
Evidence:
Ambiguity Notes: Scope of required parental controls is unspecified; could range from simple warnings to granular activity monitoring.
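The Act's undefined 'digital signal' is the interesting engineering question: how does a website trust a device's age claim without ever seeing an ID? One privacy-preserving shape is a signed, minimal claim. The HMAC construction below is purely illustrative; a real deployment would need asymmetric keys and device attestation, none of which the Act specifies.

```python
import base64
import hashlib
import hmac
import json

DEVICE_KEY = b"device-provisioned-secret"  # hypothetical shared secret

def make_age_signal(age_bracket: str) -> dict:
    """Device side: sign a minimal claim ('under_18' or '18_plus') so a
    site can gate content without ever handling an ID document."""
    payload = json.dumps({"age_bracket": age_bracket}, sort_keys=True).encode()
    tag = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return {"payload": base64.b64encode(payload).decode(), "sig": tag}

def site_allows_mature_content(signal: dict) -> bool:
    """Site side: verify the tag, then gate on the claim; an invalid or
    missing signal is treated as a minor (content blocked)."""
    payload = base64.b64decode(signal["payload"])
    expected = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, signal["sig"]):
        return False
    return json.loads(payload)["age_bracket"] == "18_plus"
```

A shared-secret HMAC is obviously wrong for the open web (every site would hold the signing key); it stands in here for whatever attestation scheme rulemaking would have to invent.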
The Artificial Intelligence Safety and Security Protocol Act requires artificial intelligence developers to create, implement, and publish safety protocols that address critical risks associated with their models. Developers are required to conduct regular risk assessments, undergo annual audits by third parties, and provide whistleblower protections. The Act also includes provisions for civil penalties for non-compliance, ensuring that developers are held accountable for the safety and security of their AI technologies.
Date | Action |
---|---|
2025-04-11 | Rule 19(a) / Re-referred to Rules Committee
2025-04-11 | House Floor Amendment No. 2 Rule 19(c) / Re-referred to Rules Committee
2025-04-08 | House Floor Amendment No. 2 Rules Refers to Cybersecurity, Data Analytics, & IT Committee
2025-04-08 | House Floor Amendment No. 2 Referred to Rules Committee
2025-04-08 | House Floor Amendment No. 2 Filed with Clerk by Rep. Daniel Didech
2025-03-26 | Held on Calendar Order of Second Reading - Short Debate |
2025-03-26 | Second Reading - Short Debate |
2025-03-20 | House Committee Amendment No. 1 Adopted in Cybersecurity, Data Analytics, & IT Committee; 007-004-000
Why Relevant: The Act includes explicit duties for AI developers to conduct and publish regular risk assessments and to implement mitigation protocols for critical risks.
Mechanism of Influence: Developers must create and publish a 'safety and security protocol' and 'risk assessment reports' every 90 days, and these must address how critical risks are managed and mitigated. Annual third-party audits are also required.
Evidence:
Ambiguity Notes: The term 'critical risks' is not defined in the provided excerpt, so the scope of what must be mitigated could be broad.
Why Relevant: The Act requires ongoing audits and public reporting, which are compliance infrastructure elements.
Mechanism of Influence: Annual third-party audits must be conducted and published, and developers must retain and justify redactions for at least 5 years, supporting transparency and oversight.
Evidence:
Ambiguity Notes: No mention of specific technical compliance infrastructure (e.g., designated legal representatives or in-jurisdiction contacts), but audit and record-keeping is mandated.
Why Relevant: The Act provides whistleblower protections and anonymous disclosure processes, which support internal controls and oversight.
Mechanism of Influence: Developers must allow employees to report unsafe practices anonymously and must maintain these disclosures for at least 7 years.
Evidence:
Ambiguity Notes: No explicit mention of user redress or appeal mechanisms for external parties.
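For a compliance team, the obligations summarized above reduce to a handful of clocks: a 90-day report cadence, five-year retention of audit materials and redaction justifications, and seven-year retention of whistleblower disclosures. A toy tracker using those figures; the field names are invented, not statutory.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class ComplianceClocks:
    last_risk_report: date
    audit_published: date
    disclosure_received: date

    def next_risk_report_due(self) -> date:
        return self.last_risk_report + timedelta(days=90)

    def audit_retention_ends(self) -> date:
        return self.audit_published + timedelta(days=5 * 365)

    def disclosure_retention_ends(self) -> date:
        return self.disclosure_received + timedelta(days=7 * 365)
```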
This bill establishes requirements for manufacturers of devices and online services to take reasonable steps to determine the age of users and to restrict access to mature content for users under 18. It mandates that websites and applications provide parental controls and comply with these age assurance requirements in a nondiscriminatory manner. The Attorney General is granted the authority to enforce these provisions through civil actions, ensuring compliance and imposing penalties for violations.
Date | Action |
---|---|
2025-06-02 | Rule 3-9(a) / Re-referred to Assignments
2025-05-23 | Rule 2-10 Committee/3rd Reading Deadline Established As June 1, 2025 |
2025-05-09 | Rule 2-10 Committee/3rd Reading Deadline Established As May 23, 2025 |
2025-04-29 | Rule 2-10 Committee/3rd Reading Deadline Established As May 9, 2025 |
2025-04-29 | Re-assigned to Executive
2025-04-11 | Rule 3-9(a) / Re-referred to Assignments
2025-03-21 | Rule 2-10 Committee Deadline Established As April 11, 2025 |
2025-03-20 | Chief Sponsor Changed to Sen. Willie Preston
Why Relevant: The bill explicitly mandates 'measures to determine the age of device users' and requires that 'websites and applications restrict access to mature content based on the user's age.' App stores must 'obtain parental consent for users under 16.'
Mechanism of Influence: It imposes binding age verification/assessment duties on device manufacturers, app stores, and online services, with legal enforcement.
Evidence:
Ambiguity Notes: The bill does not specify technical details for age assurance (e.g., biometric, document, or other methods), leaving room for broad interpretation. It does not address risk assessments, scanning, reporting, or encryption.
This bill creates the Adult Content Age Verification Act, which mandates that commercial entities that publish or distribute material harmful to minors must implement reasonable age verification methods. Entities failing to comply may face civil penalties, which will be directed to the Cyber Exploitation of Children Fund to support investigations of cybercrimes against children. The Attorney General is empowered to investigate violations and initiate legal actions.
Date | Action |
---|---|
2025-02-28 | Added as Co-Sponsor Sen. Li Arellano, Jr.
2025-02-06 | Referred to Assignments
2025-02-06 | First Reading |
2025-02-06 | Filed with Secretary by Sen. Erica Harriss
Why Relevant: The bill mandates 'reasonable age verification methods' for commercial entities providing access to material harmful to minors.
Mechanism of Influence: It compels platforms to verify the age of users accessing certain content, which may require collection of sensitive data or use of third-party verification services.
Evidence:
Ambiguity Notes: The term 'reasonable age verification methods' is not further defined, leaving open what methods must be used and how stringent or privacy-intrusive they must be.
The Parental Consent for Social Media Act establishes regulations for social media companies operating in Illinois regarding minors. It mandates that minors cannot hold accounts without parental consent, requires age verification through third-party vendors, and restricts access for minors during specified late-night hours. The bill also outlines liabilities for social media companies and third-party vendors for non-compliance.
Date | Action |
---|---|
2025-06-02 | Rule 3-9(a) / Re-referred to Assignments
2025-05-23 | Rule 2-10 Committee/3rd Reading Deadline Established As June 1, 2025 |
2025-05-09 | Rule 2-10 Committee/3rd Reading Deadline Established As May 23, 2025 |
2025-04-11 | Rule 2-10 Committee/3rd Reading Deadline Established As May 9, 2025 |
2025-03-21 | Rule 2-10 Committee Deadline Established As April 11, 2025 |
2025-03-19 | To AI and Social Media
2025-03-12 | Assigned to Executive
2025-02-07 | Referred to Assignments
Why Relevant: The bill mandates 'age verification through third-party vendors' and requires 'parental consent for minors to create accounts.'
Mechanism of Influence: Social media companies must implement reliable age checks and obtain parental consent before allowing minors to use their services. This directly targets age-verification and minor-access control.
Evidence:
Ambiguity Notes: The bill does not specify technical standards for age verification, nor does it address risk assessments, mitigation, or detection/scanning for CSAM.
This Act establishes guidelines for covered entities operating in Illinois that collect or process personal data of children under 18. It mandates that these entities assess and document their data protection practices to safeguard children's privacy and interests. The Act includes provisions for data protection impact assessments, privacy settings, and enforcement mechanisms by the Attorney General, while also creating a dedicated enforcement fund.
Date | Action |
---|---|
2025-04-11 | Rule 3-9(a) / Re-referred to Assignments
2025-04-04 | Added as Chief Co-Sponsor Sen. Mary Edly-Allen
2025-03-21 | Rule 2-10 Committee Deadline Established As April 11, 2025 |
2025-03-19 | To AI and Social Media
2025-03-12 | Assigned to Executive
2025-01-13 | Referred to Assignments
2025-01-13 | First Reading |
2025-01-13 | Filed with Secretary by Sen. Sue Rezin
Why Relevant: The Act requires 'data protection impact assessments' and ongoing documentation for online services likely accessed by children, which is analogous to the EU Chat Act's risk assessment features.
Mechanism of Influence: Covered entities must assess and update their data processing risks and mitigation for children, and provide these assessments to the Attorney General on request. They must also set high privacy defaults and provide tools for privacy rights.
Evidence:
Ambiguity Notes: The DPIA duty is broad and may cover a range of privacy and safety risks, but it does not specifically require assessment or mitigation of child sexual abuse or online harms beyond privacy.
Why Relevant: The Act empowers the Attorney General to review DPIAs and enforce compliance, which is a compliance-infrastructure feature, but does not create new reporting or centralized routing mechanisms as in the EU Chat Act.
Mechanism of Influence: Attorney General may request and review DPIAs to ensure compliance, but enforcement is limited to data protection obligations.
Evidence:
Ambiguity Notes: No explicit requirement for a designated legal representative or in-jurisdiction point of contact beyond responding to AG requests.
This bill establishes guidelines for businesses that provide online services, products, or features likely to be accessed by children. It mandates the completion of data protection impact assessments, outlines specific requirements for data management practices, and creates a working group to recommend best practices for implementation. Violations of the Act may result in civil penalties, thereby encouraging compliance to protect children's data privacy.
Date | Action |
---|---|
2025-04-11 | Senate Committee Amendment No. 1 Rule 3-9(a) / Re-referred to Assignments
2025-04-11 | Senate Committee Amendment No. 2 Rule 3-9(a) / Re-referred to Assignments
2025-04-11 | Rule 3-9(a) / Re-referred to Assignments
2025-04-04 | Added as Chief Co-Sponsor Sen. Mary Edly-Allen
2025-04-03 | Senate Committee Amendment No. 2 To AI and Social Media
2025-04-03 | Senate Committee Amendment No. 1 To AI and Social Media
2025-04-01 | Senate Committee Amendment No. 2 Assignments Refers to Executive
2025-03-21 | Rule 2-10 Committee Deadline Established As April 11, 2025 |
Why Relevant: Mandates 'data protection impact assessment' and 'mitigation plan' for risks to children, with ongoing obligations to assess and mitigate risks.
Mechanism of Influence: Requires businesses to assess risks to children for each online service, document risks, and create mitigation plans (e.g., adjust privacy settings, enforce policies).
Evidence:
Ambiguity Notes: The language on 'mitigation plan' could include a range of interventions, but there is no explicit mention of technical measures like scanning or moderation changes.
Why Relevant: Requires businesses to estimate the age of child users or apply protections to all consumers.
Mechanism of Influence: This could imply some form of age assessment or verification, though the bill does not specify technical or procedural requirements.
Evidence:
Ambiguity Notes: Does not mandate a specific age verification technology or process.
Why Relevant: Mandates 'default privacy settings' and clear privacy information, as well as tools for exercising privacy rights.
Mechanism of Influence: Businesses must configure services for high privacy by default and provide clear privacy tools and information for children.
Evidence:
Ambiguity Notes: No specific requirements for content moderation, detection/scanning, or app store obligations.
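A DPIA under a bill like this is less a one-off document than a living record per service: enumerated risks, a mitigation plan, and proof of high-privacy defaults, producible to the Attorney General on request. A minimal record shape, assuming nothing beyond the duties summarized above; all field names are invented.

```python
from dataclasses import dataclass, field

@dataclass
class DataProtectionImpactAssessment:
    service_name: str
    risks_to_children: list[str]
    mitigation_plan: list[str] = field(default_factory=list)
    high_privacy_defaults_on: bool = True  # the default-settings duty

    def producible_summary(self) -> dict:
        """What an Attorney General request might pull."""
        return {
            "service": self.service_name,
            "risks": self.risks_to_children,
            "mitigations": self.mitigation_plan,
            "defaults_high_privacy": self.high_privacy_defaults_on,
        }
```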
House Bill No. 1321 establishes guidelines for social media services regarding minors. It mandates that social media platforms cannot permit Indiana residents under 18 to create accounts without written consent from a parent or guardian. The bill also outlines specific configurations for accounts held by minors and sets penalties for violations, including enforcement actions by the attorney general and civil actions by parents or guardians. Additionally, it addresses the retention and use of personal information for age verification purposes.
Date | Action |
---|---|
2025-01-27 | Representative Teshka added as coauthor |
2025-01-13 | First reading: referred to Committee on Judiciary |
2025-01-13 | Authored by Representative King |
Why Relevant: The bill imposes an explicit age verification mandate for social media services, requiring written parental consent for minors to create accounts, and restricts the handling of personal information for age verification purposes.
Mechanism of Influence: Social media services must implement processes to verify the age of users and obtain parental consent for users under 18, and must configure accounts for minors according to statutory requirements. The bill restricts the use and retention of personal information collected for this purpose.
Evidence:
Ambiguity Notes: The bill does not specify technical standards for age verification or account configurations, nor does it address how platforms are to ensure compliance beyond requiring written consent.
Senate Bill No. 11 mandates that social media operators restrict access for users under the age of 16 unless they have obtained verifiable parental consent. The bill outlines definitions related to minors and social media, establishes the requirements for parental consent, and grants the attorney general the authority to enforce compliance through legal actions against non-compliant operators. It also ensures the confidentiality of any information collected about minor users and sets penalties for violations.
Date | Action |
---|---|
2025-03-03 | First reading: referred to Committee on Judiciary |
2025-01-24 | Referred to the House |
2025-01-23 | Senator Young M added as coauthor |
2025-01-23 | House sponsor: Representative Pressel |
2025-01-23 | Third reading: passed; Roll Call 13: yeas 42, nays 7 |
2025-01-23 | Cosponsors: Representatives King and Jeter |
2025-01-21 | Amendment #3 (Brown L) prevailed; voice vote |
2025-01-21 | Second reading: amended, ordered engrossed |
Why Relevant: The bill imposes an explicit age-verification requirement for users under 16, requiring parental consent before access.
Mechanism of Influence: Social media operators must implement systems to reliably identify users under 16 and prevent access without parental consent. This may involve age assessment/verification technologies.
Evidence:
Ambiguity Notes: The bill does not specify the technical means of age verification, leaving room for interpretation regarding the intrusiveness or reliability of the required process.
Legislation ID: 67495
Bill URL: View Bill
House File 278, known as the Social Media Parental Authorization Act, mandates that social media companies obtain parental consent before allowing minors to create accounts. It outlines the definitions of key terms, sets forth requirements for parental authorization, and establishes civil penalties for violations. The bill also allows parents to revoke authorization and mandates that companies provide access to minors' accounts for parental monitoring.
Date | Action |
---|---|
2025-03-05 | Committee report approving bill, renumbered as HF 798.
2025-02-25 | Committee vote: Yeas, 19. Nays, 2. H.J. 433.
2025-02-25 | Committee report, recommending amendment and passage. H.J. 433.
2025-02-19 | Subcommittee recommends passage. |
2025-02-17 | Subcommittee Meeting: 02/19/2025 4:00PM House Lounge. |
2025-02-11 | Subcommittee: Fett, Wheeler and Wilburn. H.J. 291.
2025-02-10 | Introduced, referred to Judiciary. H.J. 273.
Why Relevant: The bill imposes a form of age verification by requiring parental authorization for minors to create accounts on social media platforms.
Mechanism of Influence: Social media companies must implement mechanisms to verify parental consent before account creation for minors, effectively serving as an age gate.
Evidence:
Ambiguity Notes: The bill does not specify the technical means of verifying age or parental status, leaving open the possibility for a range of verification methods.
Why Relevant: The bill grants parents ongoing access to monitor and restrict their child's social media account.
Mechanism of Influence: Platforms must provide parents with tools to oversee and set restrictions on their children's accounts, impacting privacy for minors.
Evidence:
Ambiguity Notes: The scope and technical implementation of parental monitoring are not detailed.
Legislation ID: 60691
Bill URL: View Bill
House File 62 establishes civil liability for commercial entities that knowingly publish or distribute obscene material online without proper age verification methods to prevent minors from accessing such material. It mandates the use of commercially available databases or other reasonable methods for age verification. The bill does not impose liability on internet service providers.
Date | Action |
---|---|
2025-03-21 | Withdrawn. H.J. 772.
2025-03-07 | Committee report approving bill, renumbered as HF 864.
2025-03-04 | Committee vote: Yeas, 20. Nays, 0. Excused, 1. H.J. 505.
2025-03-04 | Committee report, recommending amendment and passage. H.J. 505.
2025-02-26 | Sponsor added, Young. H.J. 440.
2025-01-29 | Subcommittee recommends passage. |
2025-01-22 | Subcommittee Meeting: 01/29/2025 12:00PM RM 19. |
2025-01-16 | Subcommittee: Wheeler, Fett and Srinivas. H.J. 98.
Why Relevant: The bill imposes a binding duty on covered platforms to perform age verification for users accessing certain content, which is a core element of the EU Chat Act.
Mechanism of Influence: Platforms must implement mechanisms to reliably determine user age before granting access to obscene material, typically requiring government ID or similar credentials.
Evidence:
Ambiguity Notes: The requirement is limited to 'obscene material' and does not extend to broader categories like all social media or messaging services. 'Reasonable age verification' is defined by reference to government ID or reliable documents, but the specifics of implementation are left to platforms.
Legislation ID: 67605
Bill URL: View Bill
House File 278, known as the “Social Media Parental Authorization Act,” mandates that social media companies obtain parental consent before allowing minors to create accounts. It outlines definitions, requirements for parental authorization, enforcement mechanisms, and penalties for violations, including civil actions for harmed individuals.
Date | Action |
---|---|
2025-03-05 | Introduced, referred to Ways and Means. H.J. 521.
Why Relevant: The bill mandates age verification via parental authorization for minors, which is a core element (age verification/assessment) of the EU Chat Act’s approach.
Mechanism of Influence: Social media companies must reliably confirm parental consent before minors can create accounts, which generally requires some form of age/identity verification for users believed to be under 18.
Evidence:
Ambiguity Notes: The bill defers details on 'acceptable forms' of verification to rulemaking, so the technical intrusiveness (e.g., biometric checks, document uploads) is not specified.
Why Relevant: The bill gives parents ongoing access to the minor’s account, which goes beyond simple age gating and affects user privacy.
Mechanism of Influence: Companies must allow parental access to minors’ accounts, potentially undermining privacy and autonomy for minor users.
Evidence:
Ambiguity Notes: The scope and technical means of access (e.g., full read/write access, monitoring tools) are not detailed.
Why Relevant: The bill prohibits data collection from minors without parental authorization, which is a privacy-related platform duty.
Mechanism of Influence: Companies must implement controls to prevent data collection from minors unless parental consent is verified.
Evidence:
Ambiguity Notes: No detail on technical enforcement or penalties for accidental collection.
Legislation ID: 63826
Bill URL: View Bill
House File 62 establishes civil liability for commercial entities that knowingly publish or distribute obscene material on the internet without verifying the age of the user. It mandates the use of age verification methods to prevent minors from accessing such material, while also clarifying that providers of interactive computer services are not liable under this law.
Date | Action |
---|---|
2025-06-16 | Referred to Technology. S.J. 1057.
2025-04-03 | Placed on calendar under unfinished business. S.J. 689.
2025-03-25 | Explanations of votes. H.J. 812.
2025-03-25 | Explanation of vote. H.J. 811.
2025-03-24 | Message from House. S.J. 607.
2025-03-24 | Read first time, attached to SF 443. S.J. 607.
2025-03-20 | Passed House, yeas 88, nays 1. H.J. 763.
2025-03-20 | Immediate message. H.J. 767.
Why Relevant: The bill mandates 'reasonable age verification' for access to certain online material, which is a core element of the EU Chat Act's approach to protecting minors online.
Mechanism of Influence: Covered platforms must implement age checks (e.g., government-issued ID) before granting access to obscene material, thereby restricting minors' access.
Evidence:
Ambiguity Notes: The scope is limited to obscene material, not general online services or all child safety risks. The bill does not address broader risk assessments, detection/scanning, or mitigation plans for platforms.
Legislation ID: 63056
Bill URL: View Bill
House File 62 establishes civil liability for commercial entities that knowingly or intentionally publish or distribute obscene material online. The bill mandates that these entities must verify the age of individuals attempting to access such material to ensure they are not minors. Failure to comply with this requirement will result in civil liability for damages incurred due to a minor's access to obscene content. However, the bill explicitly states that it does not impose liability on providers or users of interactive computer services.
Date | Action |
---|---|
2025-02-26 | Committee report approving bill, renumbered as SF 443. S.J. 371.
2025-02-10 | Subcommittee recommends passage.
2025-02-06 | Subcommittee: Alons, Bennett, and Campbell. S.J. 210.
2025-02-06 | Subcommittee Meeting: 02/10/2025 12:30PM Room 217 Conference Room.
2025-02-04 | Introduced, referred to Technology. S.J. 189.
Why Relevant: The bill imposes a duty on covered platforms to implement 'reasonable age verification' for users accessing obscene material.
Mechanism of Influence: Platforms must verify user age before granting access to certain material, directly affecting user onboarding and access flows.
Evidence:
Ambiguity Notes: The term 'reasonable age verification' is not further specified, but the bill requires that verification must not retain or disseminate identifying information and may use cryptographic techniques.
Why Relevant: The bill explicitly prohibits retention or dissemination of identifying information after age verification.
Mechanism of Influence: This provision aims to safeguard user privacy during the age verification process.
Evidence:
Ambiguity Notes: The bill does not detail technical standards for ensuring anonymity beyond the prohibition on retention.
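The verify-then-discard pattern the bill gestures at can be made concrete: check the document, mint an opaque token, persist nothing identifying. The sketch below takes the actual document check as a pluggable callable, since the bill leaves the method open ('may use cryptographic techniques'); the function names are invented.

```python
import secrets
from typing import Callable

def issue_access_token(document: bytes,
                       is_adult: Callable[[bytes], bool]) -> str | None:
    """Verify age, then discard: the bill bars retaining or disseminating
    identifying information once access is granted or denied."""
    ok = is_adult(document)   # vendor API, human review, etc.
    del document              # drop the reference; nothing is written to storage
    return secrets.token_urlsafe(32) if ok else None  # opaque, carries no identity
```

The token is the only artifact that survives the check, so a breach of the site leaks sessions, not identities.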
Legislation ID: 61217
Bill URL: View Bill
House File 62 establishes regulations requiring commercial entities to verify the age of individuals attempting to access obscene material online. It mandates the use of reasonable age verification methods and holds entities accountable for damages if minors access such materials. However, it protects providers or users of interactive computer services from civil liability.
Date | Action |
---|---|
2025-06-16 | Referred to Technology. S.J. 1057.
2025-04-03 | Placed on calendar under unfinished business. S.J. 689.
2025-04-03 | Placed on calendar under unfinished business. S.J. 790.
2025-03-24 | Attached to HF 864. S.J. 607.
2025-02-26 | Committee report, approving bill. S.J. 371.
2025-02-26 | Introduced, placed on calendar. S.J. 363.
Why Relevant: The bill imposes a binding duty on platforms to implement age verification for access to certain online content.
Mechanism of Influence: Covered platforms must verify the age of users with 'reasonable age verification' before granting access to obscene material, using methods such as government-issued ID or financial documents.
Evidence:
Ambiguity Notes: The term 'reasonable age verification' is defined to some extent, but the bill leaves some discretion in the method, provided it meets the listed criteria.
Legislation ID: 165223
Bill URL: View Bill
This bill introduces a series of provisions aimed at protecting minors in Kentucky from potential online harms associated with social media and applications. It sets forth definitions, requirements for social media platforms regarding age restrictions, account management, and parental consent, and outlines the consequences for violations of these provisions. Additionally, it allows minors to seek legal recourse for violations of their rights under this act.
Date | Action |
---|---|
2025-03-04 | taken from Small Business & Information Technology (H); 2nd reading; returned to Small Business & Information Technology (H)
2025-02-28 | taken from Small Business & Information Technology (H); 1st reading; returned to Small Business & Information Technology (H)
2025-02-25 | to Small Business & Information Technology (H)
2025-02-19 | introduced in House; to Committee on Committees (H)
Why Relevant: The bill directly mandates age verification and parental consent as preconditions for minors to access apps or social media, and imposes duties on app store providers and developers to enforce these requirements.
Mechanism of Influence: App stores must verify age and obtain parental consent before allowing minors to access or purchase apps; developers must ensure compliance and may be held liable for violations.
Evidence:
Ambiguity Notes: The bill refers to 'reliable methods' for age verification and leaves the specifics to future administrative regulation, which could vary in intrusiveness.
Why Relevant: The bill creates regulatory processes for age verification and compliance, and assigns enforcement to the Attorney General.
Mechanism of Influence: The regulatory agency will establish the specific methods for age verification and compliance infrastructure.
Evidence:
Ambiguity Notes: The bill does not specify technical requirements for detection, scanning, or risk assessment, focusing instead on age gating and consent.
Why Relevant: The bill assigns liability for violations, allows civil actions by minors and parents, and creates enforcement mechanisms.
Mechanism of Influence: Violations are classified as unfair practices, subject to penalties and enforcement by the Attorney General, and minors/parents can seek damages and injunctive relief.
Evidence:
Ambiguity Notes: The scope of liability is limited to age verification and parental consent, not broader platform moderation or scanning.
Senate Bill 181 requires local school boards to implement traceable communication systems for interactions between teachers or volunteers and students, restricting communications outside of these systems with specific exceptions. It also requires reporting of unauthorized communications to the Educational Professional Standards Board (EPSB), which must review these complaints within 120 days, potentially impacting state resources depending on the volume of reports.
Date | Action |
---|---|
2025-04-01 | signed by Governor (Acts Ch. 149) |
2025-03-28 | 3rd reading, passed 100-0 with Committee Substitute (1) and Committee Amendment (1-title); received in Senate; to Rules (S); posted for passage for concurrence in House Committee Substitute (1) and Committee Amendment (1-title); Senate concurred in House Committee Substitute (1) and committee amendment (1-title); passed 38-0; enrolled, signed by President of the Senate; enrolled, signed by Speaker of the House; delivered to Governor
2025-03-27 | taken from Committee on Committees (H); to Families & Children (H); reported favorably, to Rules with Committee Substitute (1) and Committee Amendment (1-title); taken from Rules; placed in the Orders of the Day
2025-03-14 | taken from Committee on Committees (H); 2nd reading; returned to Committee on Committees (H)
2025-03-13 | taken from Committee on Committees (H); 1st reading; returned to Committee on Committees (H)
2025-03-06 | 3rd reading, passed 37-0 with Floor Amendment (1); received in House; to Committee on Committees (H)
2025-03-05 | 2nd reading, to Rules; posted for passage in the Regular Orders of the Day for Thursday, March 06, 2025
2025-03-04 | reported favorably, 1st reading, to Calendar; floor amendment (1) filed
Why Relevant: The bill imposes a platform-like duty on schools to ensure all teacher-student communications are traceable and centrally controlled, somewhat analogous to a risk mitigation measure.
Mechanism of Influence: Schools must approve and require use of monitored communication channels, which could deter or detect inappropriate contact.
Evidence:
Ambiguity Notes: The bill does not specify technical requirements for traceability (e.g., logging, content retention, access controls).
Why Relevant: The bill creates a mandatory reporting and review structure for violations, echoing reporting obligations in the EU Chat Act.
Mechanism of Influence: Any party aware of unauthorized communications must report to EPSB, which must review within 120 days.
Evidence:
Ambiguity Notes: No mention of centralized national/international reporting or transmission of metadata.
Legislation ID: 103777
Bill URL: View Bill
House Bill No. 37, known as the Kids Online Protection and Anti-Grooming Act, enacts provisions aimed at safeguarding minors on covered platforms. It outlines the responsibilities of platform operators in protecting the privacy and safety of minors, defines key terms, and sets forth penalties for violations. The bill emphasizes the importance of parental control and oversight in online interactions involving minors.
Date | Action |
---|---|
2025-06-11 | Signed by the Governor. Becomes Act No. 236. |
2025-06-11 | Effective date: 06/01/2026. |
2025-06-09 | Sent to the Governor for executive approval. |
2025-06-08 | Enrolled and signed by the Speaker of the House. |
2025-06-08 | Signed by the President of the Senate. |
2025-06-04 | Read by title, roll called, yeas 96, nays 0, Senate amendments concurred in. |
2025-06-03 | Scheduled for concurrence on 06/04/2025. |
2025-06-02 | Received from the Senate with amendments. |
Why Relevant: The Act imposes a duty on platforms to implement privacy and safety settings specifically for minors, including account visibility limits and connection restrictions.
Mechanism of Influence: Platforms must change product features and user interactions by default for minors, e.g., limiting account visibility, controlling messaging, and requiring parental involvement for adult-minor connections.
Evidence:
Ambiguity Notes: 'Duty of care' is somewhat broad, but the law lists specific required features. No explicit mention of risk assessment processes or written mitigation plans.
Why Relevant: Requires notification of legal representatives about minors' exposure to explicit material and connections, and enables parental management of settings.
Mechanism of Influence: Mandates parental controls and monitoring tools, with granular management of connections, account settings, and microtransactions.
Evidence:
Ambiguity Notes: No explicit mention of technical detection/scanning, but does require platforms to track/report certain user interactions to parents/guardians.
Why Relevant: Restricts adults from sending private messages to minors unless connected, and blocks geolocation sharing.
Mechanism of Influence: Affects interpersonal communication and sharing of sensitive data, but does not mandate scanning or detection of content/messages.
Evidence:
Ambiguity Notes: No mention of detection orders, scanning, or technical monitoring of message content; only structural restrictions.
Legislation ID: 186691
Bill URL: View Bill
This bill amends and reenacts existing laws to introduce specific provisions aimed at safeguarding minors' data and interactions with applications. It outlines definitions related to age categories, the responsibilities of application store providers and developers, and the necessary measures for obtaining parental consent before allowing minors to access certain applications or make purchases. The bill seeks to ensure that minors are protected from inappropriate content and that their personal data is handled responsibly.
Date | Action |
---|---|
2025-06-30 | Signed by the Governor. Becomes Act No. 481. |
2025-06-30 | Effective date: 07/01/2026. |
2025-06-16 | Sent to the Governor for executive approval. |
2025-06-12 | Conference Committee report received. Lies over under the rules. |
2025-06-12 | Conference Committee Report read, roll called, yeas 98, nays 0. The Conference Committee Report was adopted. |
2025-06-12 | Conference Committee Report read; adopted by a vote of 38 yeas and 0 nays. |
2025-06-12 | Notice of Senate adoption of Conference Committee Report. |
2025-06-12 | Rules suspended. |
Why Relevant: The bill requires application stores and developers to conduct age verification at account creation and to obtain verifiable parental consent for minors.
Mechanism of Influence: App stores and developers must implement age verification mechanisms and ensure parental consent workflows, which could include technical systems to reliably determine user age and collect parental permissions.
Evidence:
Ambiguity Notes: The bill does not specify the technical method for age verification, leaving room for interpretation regarding the intrusiveness or reliability of the process.
Legislation ID: 172840
Bill URL: View Bill
This bill mandates that business entities that publish or distribute obscene material online verify the age of individuals attempting to access such material. It outlines acceptable methods for age verification, prohibits retention of personal information after verification, and specifies the liabilities for businesses that fail to comply. The bill also includes exceptions for news organizations and provides for enforcement by the Attorney General.
Date | Action |
---|---|
2025-06-18 | Adhered To |
2025-06-16 | Insisted On |
Why Relevant: The bill includes a mandatory age-verification requirement for users seeking access to obscene matter online.
Mechanism of Influence: Online businesses must implement 'reasonable age-verification methods' to ensure users are 18 or older and prevent minors from accessing obscene content.
Evidence:
Ambiguity Notes: The phrase 'reasonable age-verification method' is not further defined in the abstract, so the technical specifics and intrusiveness depend on regulatory interpretation.
Why Relevant: The bill prohibits retention of identifying information after age verification.
Mechanism of Influence: This provision seeks to mitigate privacy risks by limiting data retention, though it does not specify technical standards.
Evidence:
Ambiguity Notes: It is unclear what constitutes 'identifying information' or how enforcement would verify compliance.
Legislation ID: 39405
Bill URL: View Bill
This bill establishes regulations for social media companies regarding the age of account holders. It prohibits individuals under 14 years of age from becoming account holders and mandates age verification for those seeking to create accounts. For minors aged 14 and 15, parental consent must be verified before they can create or maintain their accounts. The bill also outlines the enforcement mechanisms and penalties for violations, and it directs the Attorney General to create rules for implementing these requirements.
Date | Action |
---|---|
2025-06-16 | Accepted Report |
2025-03-04 | Referred to Committee |
2025-03-04 | Referred in Concurrence |
Why Relevant: The bill contains explicit, binding requirements for age verification and parental consent for minors, which are core elements of the EU Chat Act's approach to child protection online.
Mechanism of Influence: Social media companies must implement mechanisms to verify user age and, for 14- and 15-year-olds, verify and retain parental consent records. This likely means requiring ID upload or checks at registration, and ongoing compliance infrastructure to manage, store, and produce these records for at least two years.
Evidence:
Ambiguity Notes: While the bill is clear about age verification and parental consent, it does not specify technical standards or methods for verification (left to rulemaking), nor does it address risk assessment, content moderation, detection/scanning, reporting, or removal/blocking duties.
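Operationally, the consent duty implies a small record-keeping system: one consent record per 14- or 15-year-old account, producible for at least two years. A sketch with invented field names; the rulemaking the bill delegates would fix the real schema and acceptable verification methods.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class ParentalConsentRecord:
    account_id: str
    consent_method: str   # e.g., "signed form"; left to rulemaking
    obtained_on: date

    def must_retain_through(self) -> date:
        # Records must remain producible for at least two years.
        return self.obtained_on + timedelta(days=2 * 365)
```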
Legislation ID: 92912
Bill URL: View Bill
House Bill 1212 requires that all devices activated in Maryland after January 1, 2026, include a filter to block obscene material for users identified as minors. It prohibits anyone other than parents or legal guardians from deactivating these filters, and establishes civil and criminal liabilities for manufacturers that fail to comply. The Attorney General is empowered to enforce the law, and parents can take legal action if their children access obscene content due to non-compliance.
Date | Action |
---|---|
2025-02-06 | Hearing 2/25 at 1:00 p.m. |
2025-02-06 | First Reading Economic Matters |
Why Relevant: The bill mandates device-level age verification and filter activation for minors, which is a form of age verification and gating access to content.
Mechanism of Influence: Devices must determine user age at activation and enable filters for minors, preventing access to obscene content.
Evidence:
Ambiguity Notes: It is not specified what technical methods must be used for age verification, nor whether this extends to interpersonal communication services or only to web browsing/apps.
Why Relevant: The bill requires filtering technology to be enabled by default for minors, which is a form of compelled content blocking.
Mechanism of Influence: Manufacturers must install and enable filtering software on devices, and only parents or guardians can disable it.
Evidence:
Ambiguity Notes: The scope is limited to 'obscene material' and does not explicitly require detection or scanning for CSAM or solicitation, nor does it address encrypted communications.
Why Relevant: The bill establishes manufacturer liability and enforcement by the Attorney General for failure to comply with filtering requirements.
Mechanism of Influence: Manufacturers face civil and criminal penalties if their devices do not comply, and the Attorney General can enforce compliance.
Evidence:
Ambiguity Notes: Liability is tied to access to 'obscene material,' not specifically to CSAM or online grooming.
Why Relevant: The bill allows private lawsuits by parents if minors access obscene material due to non-compliance.
Mechanism of Influence: Parents or guardians may sue manufacturers or individuals for violations, creating a compliance incentive.
Evidence:
Ambiguity Notes: The cause of action is limited to access to obscene material, not broader child safety risks.
Legislation ID: 93123
Bill URL: View Bill
House Bill 1331 establishes guidelines and requirements for developers and deployers of artificial intelligence systems to mitigate risks of algorithmic discrimination. It requires developers to provide comprehensive disclosures about their systems, implement risk management policies, and conduct impact assessments. Additionally, it addresses consumer rights regarding data correction and the opportunity to appeal decisions made by AI systems. The bill seeks to create a framework for responsible AI use and consumer protection.
Date | Action |
---|---|
2025-02-07 | First Reading Economic Matters |
2025-02-07 | Hearing 3/04 at 1:00 p.m. |
Why Relevant: The bill requires deployers of high-risk AI systems to implement risk management policies and conduct impact assessments, which echoes the risk assessment and mitigation plan elements of the EU Chat Act.
Mechanism of Influence: Mandates annual and pre-deployment impact assessments for 'algorithmic discrimination,' and requires risk management policies for high-risk AI systems.
Evidence:
Ambiguity Notes: The bill's focus is on algorithmic discrimination, not specifically on child sexual abuse material (CSAM) or online child safety risks.
Legislation ID: 91642
Bill URL: View Bill
House Bill 394 seeks to regulate the distribution of obscene material to minors on the Internet. It establishes that commercial entities distributing such material are liable for damages if minors access it. Additionally, it mandates that these entities implement reasonable age verification methods and prohibits them from retaining identifying information about individuals who seek access to this material. The bill also clarifies that certain entities, like internet service providers, are not liable under this law.
Date | Action |
---|---|
2025-02-03 | Hearing 2/05 at 2:00 p.m. |
2025-02-03 | Hearing canceled |
2025-01-17 | Hearing 2/05 at 1:00 p.m. |
2025-01-16 | First Reading Judiciary |
Why Relevant: The bill mandates that commercial entities implement 'reasonable age verification methods' for users attempting to access obscene material online (Subtitle 26). This is a core element of the EU Chat Act's approach to online child protection, specifically age verification/assessment.
Mechanism of Influence: Covered entities must deploy mechanisms to verify users' ages before granting access to certain online content, likely requiring users to present age or identity credentials. This could impact user privacy and access, though the bill also prohibits retention of identifying information.
Evidence:
Ambiguity Notes: 'Reasonable age verification methods' is not defined in detail, leaving ambiguity about the required technical implementation or acceptable standards. The bill does not specify technologies or mandate particular solutions (e.g., biometric, government ID, third-party checks).
Why Relevant: The bill prohibits commercial entities from retaining identifying information after granting or denying access. This is relevant to privacy and data minimization, which are often implicated in age verification schemes.
Mechanism of Influence: Entities cannot store personal data used for age checks, reducing risk of data breaches but possibly complicating repeat verification or audit trails.
Evidence:
Ambiguity Notes: The scope of 'identifying information' is not defined; no provisions on data security, audit, or oversight are included.
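A minimal sketch of the verify-then-discard pattern the retention ban implies. The `verify` callable stands in for whatever 'reasonable age verification method' an entity adopts; the bill names no specific mechanism.

```python
from typing import Callable

def check_age_and_discard(id_artifact: bytes,
                          verify: Callable[[bytes], bool]) -> bool:
    # `verify` is whatever method the entity adopts (third-party check,
    # ID parse, etc.); it must return a bare boolean.
    is_adult = verify(id_artifact)
    # Nothing from id_artifact is logged, cached, or persisted; only the
    # boolean outcome leaves this function, matching the retention ban.
    return is_adult
```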
Legislation ID: 174865
Bill URL: View Bill
This legislation introduces Chapter 93M to the General Laws of Massachusetts, which sets forth definitions and regulations regarding addictive social media feeds. It establishes criteria for identifying minors using these platforms and prohibits certain practices that could expose minors to addictive content. The bill also mandates the Attorney General to create regulations to ensure compliance and outlines the enforcement mechanisms for violations.
Date | Action |
---|---|
2025-10-08 | Reporting date extended until Friday, November 14, 2025 |
2025-06-30 | Hearing scheduled for 07/10/2025 from 01:00 PM-05:00 PM in A-2 |
2025-06-23 | Senate concurred |
2025-05-19 | Reported, referred to the committee on Joint Rules, reported, rules suspended and referred to the committee on Advanced Information Technology, the Internet and Cybersecurity |
2025-03-24 | Referred to the committee on House Rules |
Why Relevant: The bill contains explicit mandates for age verification before providing certain content to minors.
Mechanism of Influence: Operators must verify users' ages using 'reasonable methods' before allowing access to addictive feeds, and the Attorney General will set regulatory standards for this process.
Evidence:
Ambiguity Notes: The exact nature of 'reasonable methods' is left to future regulation, so the intrusiveness of verification could vary. There is no mention of biometric, document-based, or other specific verification techniques.
Why Relevant: The bill requires the Attorney General to promulgate rules and enforce compliance, which could include compliance infrastructure.
Mechanism of Influence: Operators will have to follow regulations and may need to implement compliance systems to avoid civil penalties.
Evidence:
Ambiguity Notes: The specific compliance infrastructure is not detailed, but the regulatory and enforcement framework is established.
Why Relevant: Mandates deletion of age determination data after use, which touches on privacy/data retention concerns.
Mechanism of Influence: Operators must delete age verification data after it is used, reducing long-term privacy risks.
Evidence:
Ambiguity Notes: No details on technical implementation or audit mechanisms.
Legislation ID: 84174
Bill URL: View Bill
This bill introduces regulations concerning the use of personal electronic devices in public schools and mandates educational policies regarding the risks associated with social media use. It seeks to create a structured approach to minimize distractions in educational environments and educate students about the potential harms of social media, while also establishing guidelines for social media platforms to protect minors.
Date | Action |
---|---|
2025-09-15 | Reporting date extended to Wednesday, December 17, 2025 |
2025-06-06 | Hearing scheduled for 06/17/2025 from 01:00 PM-05:00 PM in B-2 |
2025-02-27 | Senate concurred |
2025-02-27 | Referred to the committee on Education |
Why Relevant: The bill explicitly requires social media platforms to implement age verification systems and set privacy-protective defaults for minors.
Mechanism of Influence: Platforms must deploy an 'age assurance system with 99% accuracy' and set privacy/engagement-limiting defaults for minors, directly imposing technical and operational duties on providers.
Evidence:
Ambiguity Notes: The language does not specify the technical means for age assurance or detail the scope of privacy/engagement restrictions, leaving room for interpretation about implementation depth.
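For illustration, a check of the kind an operator might run against a labeled evaluation set to confirm the statutory 99% bar. Only the 0.99 threshold comes from the bill; the evaluation setup is assumed.

```python
def meets_statutory_accuracy(predictions: list[bool],
                             ground_truth: list[bool],
                             threshold: float = 0.99) -> bool:
    """True if the age-assurance system meets the bill's 99% accuracy bar.

    predictions[i] is True when the system classifies user i as a minor;
    ground_truth holds the labeled answers for the evaluation set.
    """
    correct = sum(p == g for p, g in zip(predictions, ground_truth))
    return correct / len(ground_truth) >= threshold
```

Note that raw accuracy can look strong even when most minors are misclassified, if minors are a small share of the evaluation set; the bill does not say how accuracy is to be measured.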
Legislation ID: 85331
Bill URL: View Bill
This bill introduces Chapter 93M to the General Laws of Massachusetts, which outlines regulations for social media platforms that provide addictive feeds. It defines key terms, establishes unlawful practices for operators regarding minor users, and empowers the attorney general to enforce compliance through regulations and legal actions. The bill seeks to ensure that social media operators take reasonable measures to prevent minors from accessing addictive content and to regulate notifications sent to users during late-night hours.
Date | Action |
---|---|
2025-07-24 | Bill reported favorably by committee and referred to the committee on Senate Ways and Means |
2025-06-30 | Hearing scheduled for 07/10/2025 from 01:00 PM-05:00 PM in A-2 |
2025-02-27 | Referred to the committee on Advanced Information Technology, the Internet and Cybersecurity |
2025-02-27 | House concurred |
Why Relevant: The bill explicitly requires operators to verify the age of users before providing 'addictive feeds' to minors, establishing a duty to implement age verification mechanisms.
Mechanism of Influence: Operators must implement age verification systems to restrict minors' access to addictive feeds and certain notifications, subject to regulatory standards. This is a direct, binding duty.
Evidence:
Ambiguity Notes: The bill leaves methods of age verification to future regulations, with a mandate to consider anonymity and proportionality. The scope of 'reasonable methods' may be interpreted broadly or narrowly depending on the attorney general's regulations.
Legislation ID: 85572
Bill URL: View Bill
This bill seeks to amend Chapter 71 of the General Laws to require public schools to implement policies regulating the use of personal electronic devices and to educate students about the risks of social media. These policies aim to minimize distractions in learning environments and promote student safety and well-being. Additionally, it proposes a new chapter focusing on online protection for minors, mandating social media platforms to enforce age verification and implement protective measures for underage users.
Date | Action |
---|---|
2025-07-10 | Accompanied a new draft, see S2549 |
2025-06-06 | Hearing scheduled for 06/17/2025 from 01:00 PM-05:00 PM in B-2 |
2025-02-27 | House concurred |
2025-02-27 | Referred to the committee on Education |
Why Relevant: The bill contains explicit statutory language requiring social media platforms to implement 'age verification systems with high accuracy' for minor users. This is a core feature of the EU Chat Act and similar regulatory regimes.
Mechanism of Influence: Platforms will be compelled to identify the ages of users and restrict certain features or content for minors, directly affecting onboarding, access, and possibly privacy for all users.
Evidence:
Ambiguity Notes: The bill does not specify the technical means for age verification, nor does it address proportionality, error rates, or privacy safeguards. It is unclear if the requirement applies to all communications or only certain platform features.
Why Relevant: The bill requires 'default privacy settings for minor users to limit exposure and engagement.' This is a risk mitigation feature, though less comprehensive than the EU Chat Act's ongoing risk assessment and mitigation planning.
Mechanism of Influence: Platforms must proactively set privacy defaults for minors, which could include limiting discoverability, messaging, or sharing. This is a form of product-level mitigation aimed at child safety.
Evidence:
Ambiguity Notes: No explicit obligation for ongoing risk assessment, mitigation planning, or audits. The scope of 'default privacy settings' is not detailed.
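A sketch of what privacy-protective defaults might look like at onboarding. The specific settings shown are assumptions; the bill does not enumerate which defaults qualify.

```python
MINOR_DEFAULTS = {
    "profile_visibility": "private",      # illustrative settings; the bill
    "direct_messages": "contacts_only",   # does not enumerate defaults
    "discoverable_in_search": False,
    "personalized_ads": False,
}

def apply_onboarding_defaults(account: dict, is_minor: bool) -> dict:
    """Apply privacy-protective defaults when the new user is a verified minor."""
    if is_minor:
        account.update(MINOR_DEFAULTS)
    return account
```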
Legislation ID: 89523
Bill URL: View Bill
This bill proposes amendments to Chapter 12 of the General Laws of Massachusetts to create regulations for social media platforms regarding algorithm accountability. It defines key terms, establishes an office for oversight, mandates annual reporting on children's interactions with social media, and sets up a framework for independent audits of algorithms to assess risks to children. The bill aims to protect minors from potential harms associated with algorithm-driven content delivery and to enhance transparency in how personal data is used by these platforms.
Date | Action |
---|---|
2025-08-28 | Hearing rescheduled to 09/11/2025 from 01:00 PM-05:00 PM in A-2 and Virtual; hearing updated to include Virtual |
2025-02-27 | Referred to the committee on Advanced Information Technology, the Internet and Cybersecurity |
2025-02-27 | House concurred |
Why Relevant: The bill requires 'independent third-party auditors to conduct monthly algorithm risk audits for covered platforms, focusing on potential harms to children.' It also mandates platforms to report on identified harms and mitigation steps.
Mechanism of Influence: Platforms must regularly assess and report risks to children from their algorithms, and take steps to mitigate identified harms, with oversight from an independent office and advisory council.
Evidence:
Ambiguity Notes: The bill does not specify the exact types of harms or mitigation measures required, leaving some discretion to the office and council.
Why Relevant: The bill establishes an Office of Social Media Transparency and Accountability to oversee compliance, receive reports, and maintain a list of auditors.
Mechanism of Influence: Creates compliance infrastructure and regulatory oversight, similar to the EU Chat Act's coordinating authority.
Evidence:
Ambiguity Notes: Does not specify direct technical requirements for platforms, but grants broad oversight powers.
Why Relevant: Mandates annual registration and transparency reports from covered platforms, including data on child users and engagement mechanisms.
Mechanism of Influence: Forces platforms to collect, analyze, and disclose information about children’s use and engagement, supporting ongoing regulatory monitoring.
Evidence:
Ambiguity Notes: Reporting is broad; the specific data elements may be determined by the office.
Why Relevant: Establishes civil penalties and injunctive relief for violations.
Mechanism of Influence: Provides enforcement tools to ensure compliance with the bill’s obligations.
Evidence:
Ambiguity Notes: Penalties are monetary and do not directly mandate technical changes.
Legislation ID: 128507
Bill URL: View Bill
House Bill No. 4388 seeks to regulate social media platforms by enforcing age verification processes for account holders, particularly minors. It mandates that social media companies confirm parental consent for minors and sets forth guidelines on how minors' accounts should be managed, including restrictions on visibility and access times. The bill also outlines civil penalties for non-compliance and the powers of the attorney general in enforcing these regulations.
Date | Action |
---|---|
2025-09-18 | rule suspended |
2025-09-18 | referred to Committee on Regulatory Reform |
2025-09-18 | motion to discharge committee approved |
2025-09-18 | placed on second reading |
2025-04-29 | bill electronically reproduced 04/24/2025 |
2025-04-24 | introduced by Representative Rep. Mark Tisdel |
2025-04-24 | referred to Committee on Communications and Technology |
2025-04-24 | read a first time |
Why Relevant: The bill requires social media companies to implement age verification for all account applicants and obtain parental consent for minors before allowing access.
Mechanism of Influence: Platforms must deploy technical systems to verify user ages and confirm parental consent, which may include collection of sensitive identification data. This is a strong signal for age verification/assessment obligations.
Evidence:
Ambiguity Notes: The bill does not specify particular technologies or standards for age verification, leaving room for interpretation.
Why Relevant: The bill prohibits access to minor accounts during late-night hours unless modified by parents.
Mechanism of Influence: Imposes technical controls on account access based on time and user age, requiring platforms to implement access restriction features.
Evidence:
Ambiguity Notes: Does not specify how platforms must implement these controls (e.g., by blocking logins or disabling features).
Why Relevant: Requires social media companies to provide parents access to view all posts and messages from minors' accounts.
Mechanism of Influence: Mandates a form of parental monitoring, which may require platforms to build infrastructure for parental oversight of private communications.
Evidence:
Ambiguity Notes: Does not specify if this includes end-to-end encrypted messages or how privacy is balanced.
Why Relevant: Prohibits targeted advertising and collection of personal information from minor accounts.
Mechanism of Influence: Requires platforms to implement controls to prevent data collection and ad targeting for minors, affecting internal data handling and advertising systems.
Evidence:
Ambiguity Notes: Does not mandate specific detection or scanning technologies.
Why Relevant: Minor accounts are not to appear in search results.
Mechanism of Influence: Platforms must implement technical controls to exclude minor accounts from discovery/search features.
Evidence:
Ambiguity Notes: No details on enforcement or technical standards.
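A sketch of the kind of time-window check the late-night restriction implies. The 10:30 p.m. to 6:30 a.m. window and the override flag are illustrative assumptions; the excerpted text does not fix exact hours or override semantics.

```python
from datetime import datetime, time

QUIET_START = time(22, 30)  # illustrative window; the bill's exact hours may differ
QUIET_END = time(6, 30)

def login_allowed(now: datetime, is_minor: bool, parental_override: bool) -> bool:
    """Block minor-account access during late-night hours unless a parent
    has modified the restriction."""
    if not is_minor or parental_override:
        return True
    t = now.time()
    in_quiet_hours = t >= QUIET_START or t < QUIET_END  # window wraps midnight
    return not in_quiet_hours
```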
Legislation ID: 140721
Bill URL: View Bill
House Bill No. 4429, known as the Digital Age Assurance Act, establishes regulations for websites, applications, and online services that host mature content. It mandates that covered manufacturers and digital platforms take necessary steps to verify the age of users, restrict access to mature content based on age, and implement parental controls. The bill also outlines enforcement mechanisms and penalties for non-compliance.
Date | Action |
---|---|
2025-09-18 | rule suspended |
2025-09-18 | referred to Committee on Regulatory Reform |
2025-09-18 | motion to discharge committee approved |
2025-09-18 | placed on second reading |
2025-05-07 | bill electronically reproduced 05/06/2025 |
2025-05-06 | read a first time |
2025-05-06 | referred to Committee on Communications and Technology |
2025-05-06 | introduced by Representative Rep. Brad Paquette |
Why Relevant: The bill mandates 'estimate the age of users' and 'provide a digital signal regarding user age,' as well as requiring websites/apps to 'block access to mature content for users under 18.'
Mechanism of Influence: This creates a statutory obligation for both device manufacturers and online services to implement age verification/assessment and to use those signals to restrict content, similar to the EU Chat Act's age-verification and gating provisions.
Evidence:
Ambiguity Notes: The bill does not specify the technical means of age estimation or verification, nor does it mandate specific accuracy or privacy safeguards. It is unclear if biometric or document-based verification is required, or if less intrusive methods are permitted.
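The 'digital signal regarding user age' could be as simple as a coarse, machine-readable age bracket passed from the device OS to apps and sites. The JSON shape and bracket boundaries below are assumptions; the bill names the signal but not its format.

```python
import json

def device_age_signal(estimated_age: int) -> str:
    """Emit a coarse, machine-readable age bracket for apps and websites."""
    if estimated_age < 13:
        bracket = "under_13"
    elif estimated_age < 18:
        bracket = "13_to_17"
    else:
        bracket = "18_plus"
    return json.dumps({"age_bracket": bracket})

def gate_mature_content(signal: str) -> bool:
    """A covered site blocks mature content unless the device signals 18+."""
    return json.loads(signal)["age_bracket"] == "18_plus"
```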
Legislation ID: 96497
Bill URL: View Bill
Senate Bill No. 190, known as the Social Media Children Protection Act, mandates social media companies to verify the age of users and obtain parental consent for minors under 16 years old. It outlines requirements for parental supervision of minor accounts, prohibits retention of personal identifying information used for verification, and establishes civil penalties for violations. The bill also declares certain contractual provisions void if they limit protections outlined in the act.
Date | Action |
---|---|
2025-03-20 | REFERRED TO COMMITTEE ON REGULATORY AFFAIRS |
2025-03-20 | INTRODUCED BY SENATOR THOMAS ALBERT |
Why Relevant: The bill explicitly mandates age verification for all users and parental consent for minors, which is a core feature of the EU Chat Act.
Mechanism of Influence: Social media companies must implement reliable systems to verify user ages at account creation and deny access if verification or consent is not obtained.
Evidence:
Ambiguity Notes: The bill does not specify technical standards for age verification or what constitutes sufficient 'verification.'
Why Relevant: The bill requires social media companies to provide parental supervision tools for minor accounts, including privacy and access controls.
Mechanism of Influence: Platforms must build and maintain functionality that allows parents to monitor and manage their children's account settings and usage.
Evidence:
Ambiguity Notes: The scope of 'parental supervision tools' is not exhaustively defined.
Why Relevant: The bill prohibits retention of personal identifying information used for age verification.
Mechanism of Influence: Platforms must design verification systems that do not store user PII for longer than necessary.
Evidence:
Ambiguity Notes: No guidance on technical safeguards or audit requirements.
Legislation ID: 96500
Bill URL: View Bill
Senate Bill No. 191, known as the Material Harmful to Minors Regulation Act, seeks to impose regulations on commercial entities that publish or distribute material deemed harmful to minors. It outlines definitions, responsibilities, and enforcement mechanisms for ensuring that minors are not exposed to inappropriate content on the internet. The bill mandates reasonable age verification methods for accessing such content and establishes penalties for non-compliance.
Date | Action |
---|---|
2025-03-20 | REFERRED TO COMMITTEE ON REGULATORY AFFAIRS |
2025-03-20 | INTRODUCED BY SENATOR THOMAS ALBERT |
Why Relevant: The bill imposes mandatory age verification for access to certain online material, a core element of the EU Chat Act's approach.
Mechanism of Influence: Commercial entities must implement 'reasonable age verification methods' before allowing access to content deemed harmful to minors (see 'Age Verification Requirements').
Evidence:
Ambiguity Notes: The bill does not specify technical methods for age verification or address interoperability, but the duty is clear.
Legislation ID: 140258
Bill URL: View Bill
This bill, known as the digital age assurance act, establishes requirements for covered manufacturers and digital platforms regarding the handling of mature content. It mandates that these entities take steps to determine the age of users and restrict access to mature content based on that age, ensuring compliance with specific guidelines to protect minors. The bill also outlines enforcement mechanisms and penalties for non-compliance.
Date | Action |
---|---|
2025-05-06 | REFERRED TO COMMITTEE ON FINANCE, INSURANCE, AND CONSUMER PROTECTION |
2025-05-06 | INTRODUCED BY SENATOR JOHN CHERRY |
Why Relevant: The bill requires websites/online services to recognize age signals and restrict access to mature content for users under 18, and mandates covered manufacturers to determine or estimate user age and communicate this to digital platforms.
Mechanism of Influence: Mandates robust age verification/assessment and parental consent for minors, directly imposing platform and manufacturer duties to identify child users and block/gate content.
Evidence:
Ambiguity Notes: The bill does not specify the technical methods for age determination or whether these measures apply to private communications or only to mature content. The scope of 'mature content' is not detailed here.
Why Relevant: The bill covers both device manufacturers and digital platforms, requiring them to implement compliance mechanisms and share age information.
Mechanism of Influence: Establishes compliance infrastructure (age signals, parental controls, enforcement authority, and penalties).
Evidence:
Ambiguity Notes: No explicit risk assessment, mitigation, detection/scanning, reporting, or removal/blocking orders are described. No mention of encryption or scanning of private communications.
Legislation ID: 32085
Bill URL: View Bill
This bill introduces regulations for commercial entities that share or distribute material deemed harmful to minors on their websites. It mandates that these entities verify the age of users accessing such material to ensure they are 18 years or older. The bill outlines definitions, requirements for age verification, data privacy protections, enforcement mechanisms, and the liabilities for violations.
Date | Action |
---|---|
2025-02-24 | Introduction and first reading, referred to Commerce Finance and Policy |
Why Relevant: The bill imposes a binding duty on covered commercial entities to verify the age of users before granting access to certain online content, specifically material harmful to minors.
Mechanism of Influence: Entities must implement technical measures to verify user age, which could include uploading government ID or using third-party verification services. This directly impacts user access and could affect privacy depending on methods used.
Evidence:
Ambiguity Notes: The bill does not specify the exact technical methods for age verification, only that they must be 'approved.' It does not mandate scanning, content moderation, or risk assessment beyond age gating. The definition of 'material harmful to minors' may be subject to interpretation.
Why Relevant: The bill includes a data privacy provision restricting retention of identifying information used for age verification.
Mechanism of Influence: Entities are prohibited from retaining identifying information after age verification, reducing risks of data breaches or misuse.
Evidence:
Ambiguity Notes: The bill does not detail what constitutes 'identifying information' beyond the definition section, nor how compliance will be audited.
Why Relevant: The bill establishes enforcement and penalties for non-compliance, creating a legal incentive for covered entities to implement age verification.
Mechanism of Influence: Attorney general investigations and civil actions by individuals create liability risks for non-compliant entities. This could encourage widespread adoption of age verification technologies.
Evidence:
Ambiguity Notes: No mention of criminal penalties or platform-wide scanning obligations. Enforcement is civil, not criminal.
Why Relevant: The bill explicitly excludes ISPs and users of interactive computer services from liability, narrowing its scope to commercial entities hosting or distributing harmful content.
Mechanism of Influence: This limits the bill’s reach to content providers, not infrastructure or user-to-user platforms.
Evidence:
Ambiguity Notes: Does not address platforms with user-generated content unless they fit the 'commercial entity' definition and host harmful material.
Legislation ID: 53380
Bill URL: View Bill
This bill establishes regulations for social media platforms regarding minors aged 15 and younger, requiring them to implement anonymous age verification and to prohibit minors under 14 from creating accounts without parental consent. It also outlines penalties for violations and the responsibilities of social media companies in managing accounts for minors. Furthermore, it addresses the dissemination of material harmful to minors and sets forth age verification requirements for commercial entities that publish such content.
Date | Action |
---|---|
2025-03-05 | Introduction and first reading, referred to Commerce Finance and Policy |
Why Relevant: The bill explicitly requires 'anonymous age verification' for both social media platforms and commercial entities publishing material harmful to minors.
Mechanism of Influence: Social media platforms must implement age verification systems to reliably identify minor users and to enforce age-based account restrictions and parental consent requirements.
Evidence:
Ambiguity Notes: The bill does not specify the technical method for 'anonymous age verification,' nor does it define the standards for reliability or privacy.
Why Relevant: The bill imposes duties on platforms to manage and terminate accounts for minors based on age and parental consent, and to delete personal data.
Mechanism of Influence: Platforms must assess user age and take action to terminate or restrict accounts, which could require new compliance infrastructure.
Evidence:
Ambiguity Notes: No mention of risk assessments, mitigation plans, or technical scanning for CSAM or solicitation.
Legislation ID: 33474
Bill URL: View Bill
This bill establishes regulations for social media platforms operating in Minnesota, specifically targeting algorithms that direct user-generated content towards minors. It defines key terms related to social media usage, sets forth prohibitions on algorithmic targeting, and outlines requirements for parental consent for minors. The bill also includes provisions for liability and penalties for violations, aiming to create a safer online environment for children.
Date | Action |
---|---|
2025-02-10 | Introduction and first reading, referred to Commerce Finance and Policy |
Why Relevant: The bill requires platforms to obtain verifiable parental consent for minors to open accounts, which is a form of age verification.
Mechanism of Influence: Platforms must implement mechanisms to reliably identify users under 18 and ensure parental consent, which may involve collecting age and identity information.
Evidence:
Ambiguity Notes: The bill does not specify technical standards for age verification, leaving room for interpretation on implementation.
Why Relevant: The bill prohibits algorithmic targeting of content to minors and requires a chronological feed, which is a mitigation of online risk for children.
Mechanism of Influence: Platforms must alter recommender systems and content delivery for minors, changing core product features to comply.
Evidence:
Ambiguity Notes: Does not require ongoing risk assessments or comprehensive mitigation plans beyond algorithmic changes.
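A sketch of the product change this implies: one feed-builder that falls back to a reverse-chronological feed of followed accounts for users identified as minors. Field names are illustrative, not drawn from the bill.

```python
from dataclasses import dataclass
from datetime import datetime
from operator import attrgetter

@dataclass
class Post:
    author_followed: bool      # does the viewer follow the author?
    created_at: datetime
    engagement_score: float    # recommender ranking signal

def build_feed(posts: list[Post], is_minor: bool) -> list[Post]:
    """Minors get a reverse-chronological feed of followed accounts;
    other users keep the engagement-ranked recommender feed."""
    if is_minor:
        followed = [p for p in posts if p.author_followed]
        return sorted(followed, key=attrgetter("created_at"), reverse=True)
    return sorted(posts, key=attrgetter("engagement_score"), reverse=True)
```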
Legislation ID: 30477
Bill URL: View Bill
This bill establishes regulations concerning social media platforms operating in Minnesota, particularly focusing on the use of algorithms that target minors. It defines key terms related to social media and outlines prohibitions against targeting user-generated content at minors through recommendation features. The bill also mandates parental consent for minors to create accounts and outlines penalties for non-compliance.
Date | Action |
---|---|
2025-02-17 | Introduction and first reading |
Why Relevant: The bill imposes a duty on platforms to prevent algorithmic targeting of content to minors and requires verifiable parental consent for account creation by minors.
Mechanism of Influence: Platforms must implement mechanisms to determine user age and obtain parental consent, and must alter recommendation systems for users identified as minors. This may require age assessment and changes to recommender algorithms.
Evidence:
Ambiguity Notes: The bill does not specify the technical means of age verification or parental consent, nor does it require risk assessments, detection/scanning, reporting, or data preservation. It is ambiguous whether any scanning or surveillance of private communications is required.
Why Relevant: The bill includes an age verification/parental consent requirement, which is a strong signal of a spiritual successor element.
Mechanism of Influence: Platforms must implement age-gating to verify users' ages and collect parental consent for minors, which may have implications for user privacy and access.
Evidence:
Ambiguity Notes: It is not explicit if this entails biometric or document-based age verification, nor how robust the verification must be.
Legislation ID: 52887
Bill URL: View Bill
This bill establishes requirements for age verification on websites that contain material deemed harmful to minors. It includes definitions of key terms, sets forth the obligations of commercial entities, outlines data privacy protections, and provides for enforcement mechanisms by the attorney general as well as a private right of action for individuals affected by violations.
Date | Action |
---|---|
2025-03-03 | Introduction and first reading |
Why Relevant: The bill imposes a duty on commercial entities to verify user age before access to harmful material, which is a core feature of the EU Chat Act's risk mitigation approach (age verification/assessment).
Mechanism of Influence: Websites must implement technical measures to verify user age, likely requiring upload of documents or use of third-party databases, affecting user privacy and access.
Evidence:
Ambiguity Notes: The term 'commercially available database or other approved methods' is broad and could encompass a range of verification technologies, but the bill does not specify biometric or persistent identification.
Legislation ID: 74402
Bill URL: View Bill
This bill proposes regulations for social media platforms concerning minors aged 15 and younger. It mandates anonymous age verification for platforms that may expose minors to harmful content, establishes requirements for account management based on age, and outlines penalties for non-compliance. The bill seeks to protect young users from the risks associated with social media usage.
Date | Action |
---|---|
2025-03-17 | Introduction and first reading |
2025-03-17 | Referred to Commerce and Consumer Protection |
Why Relevant: The bill directly mandates age verification for access to social media and content deemed harmful to minors.
Mechanism of Influence: Platforms must implement anonymous age verification for all users to prevent access by minors to certain content, and must manage accounts based on verified age.
Evidence:
Ambiguity Notes: The term 'anonymous age verification' is not defined in detail, so practical implementation may vary; unclear if this involves third-party services or in-house solutions.
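One way to read 'anonymous age verification' is a token scheme in which an accredited verifier checks identity out-of-band and hands the platform only a signed over/under claim. The sketch below uses a shared-key HMAC for brevity; a real deployment would more plausibly use asymmetric signatures or zero-knowledge proofs, and the over-15 threshold is an assumption reflecting the bill's focus on minors aged 15 and younger.

```python
import hashlib
import hmac

VERIFIER_KEY = b"demo-shared-secret"  # assumption: symmetric trust between
                                      # verifier and platform, for brevity only

def issue_token(is_over_15: bool) -> bytes:
    """Run by the accredited verifier after an out-of-band identity check."""
    claim = b"over15" if is_over_15 else b"under"
    mac = hmac.new(VERIFIER_KEY, claim, hashlib.sha256).hexdigest().encode()
    return claim + b"." + mac

def platform_accepts(token: bytes) -> bool:
    """The platform learns only the age claim, never the user's identity."""
    claim, _, mac = token.partition(b".")
    expected = hmac.new(VERIFIER_KEY, claim, hashlib.sha256).hexdigest().encode()
    return hmac.compare_digest(mac, expected) and claim == b"over15"
```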
Why Relevant: The bill requires platforms to manage accounts based on user age, including parental consent and account termination.
Mechanism of Influence: Platforms must prohibit account creation for under-14s and require parental consent for 14- and 15-year-olds, with strict account termination timelines.
Evidence:
Ambiguity Notes: No explicit technical requirements for how age is verified or how accounts are terminated.
Why Relevant: The bill sets out enforcement and penalty provisions for non-compliance.
Mechanism of Influence: Attorney general can impose penalties and minors can sue for damages; creates strong incentives for compliance.
Evidence:
Ambiguity Notes: Penalties are civil, not criminal; does not specify technical audit or oversight processes.
Legislation ID: 31331
Bill URL: View Bill
This bill addresses the growing concern of human trafficking and child exploitation, particularly in the context of the internet. It seeks to implement filters on internet-enabled devices that block access to websites facilitating trafficking and child exploitation. The bill also outlines definitions related to trafficking, adult entertainment, and obscene materials, and sets forth requirements for retailers of internet-enabled devices to ensure compliance with these new regulations.
Date | Action |
---|---|
2025-01-27 | Introduction and first reading |
Why Relevant: The bill imposes a duty on retailers to ensure internet-enabled devices are equipped with filters that block access to certain harmful online content, including child pornography and trafficking sites.
Mechanism of Influence: Mandates technical measures (filters) at the device level, targeting access to specific categories of illegal or harmful content.
Evidence:
Ambiguity Notes: The bill does not specify the technical details of the filters, how 'websites known for facilitating human trafficking' are determined, or the scope of the blocking (e.g., all web traffic or just browsers).
Why Relevant: The bill requires a reporting mechanism for consumers to flag sites that are not blocked by the filter.
Mechanism of Influence: Creates a feedback loop for ongoing maintenance of the blocklist, but does not mandate reporting to law enforcement or a central authority.
Evidence:
Ambiguity Notes: Unclear if reports are routed to authorities or only used to update filters.
Why Relevant: The bill addresses online child exploitation and trafficking, with explicit device-level countermeasures.
Mechanism of Influence: Focuses on device-level access control rather than platform/provider risk assessments, scanning, or moderation.
Evidence:
Ambiguity Notes: No requirements for providers/platforms to scan, assess, or report; no mention of end-to-end encryption, app stores, or user age verification.
Legislation ID: 14660
Bill URL: View Bill
House Bill No. 701 aims to define and penalize the distribution of child exploitation materials by commercial entities. It holds these entities liable for damages and attorney fees if they knowingly publish or distribute obscene material or content promoting child sexual exploitation. The bill also clarifies the responsibilities of internet service providers and allows the Attorney General to seek injunctive relief against violators. Additionally, it outlines the penalties for individuals involved in the distribution or possession of child sexual exploitation devices or images, ensuring that victims can seek damages in court.
Date | Action |
---|---|
2025-02-04 | (H) Died In Committee |
2025-01-15 | (H) Referred To Judiciary A |
Why Relevant: The bill addresses the liability of commercial entities for distributing or publishing child sexual exploitation materials online and allows for civil and injunctive remedies.
Mechanism of Influence: It imposes civil penalties and enables injunctive relief against platforms that knowingly host or distribute such materials.
Evidence:
Ambiguity Notes: The bill requires knowledge of the content for liability, focusing on entities that 'knowingly publish or distribute.' There is no explicit duty to proactively detect or scan for such materials.
Why Relevant: The Attorney General is empowered to seek injunctive relief and develop compliance guidance.
Mechanism of Influence: This allows for court orders to stop distribution or require removal of illegal content.
Evidence:
Ambiguity Notes: There is no mention of mandatory removal timelines, blocking, or detection technologies.
Legislation ID: 16119
Bill URL: View Bill
Senate Bill No. 2501 establishes civil liabilities and penalties for commercial entities and individuals involved in the distribution or possession of obscene materials and child sexual exploitation devices or images. It empowers the Attorney General to seek injunctive relief and outlines the definitions and procedures for enforcement of the act.
Date | Action |
---|---|
2025-02-04 | (S) Died In Committee |
2025-01-20 | (S) Referred To Judiciary, Division A |
Why Relevant: The bill addresses online child sexual exploitation and imposes liability on commercial entities for distributing such content.
Mechanism of Influence: Entities that knowingly publish or distribute prohibited content can be sued for damages and injunctive relief.
Evidence:
Ambiguity Notes: The duty is limited to 'knowingly' distributing content and does not create proactive obligations to detect, prevent, or report such content.
Legislation ID: 106968
Bill URL: View Bill
This bill introduces regulations concerning the participation of minors in content creation on social media platforms, defining key terms and outlining requirements for content creators. It mandates record-keeping and compensation structures and allows minors to request the removal of content featuring them. Additionally, it establishes penalties for unlawful content involving minors and requires social media platforms to implement strategies to mitigate risks associated with such content.
Date | Action |
---|---|
2025-04-10 | Referred: Rules - Administrative(H) |
2025-04-02 | HCS Reported Do Pass (H) - AYES: 8 NOES: 0 PRESENT: 0 |
2025-04-01 | Executive Session Completed (H) |
2025-04-01 | HCS Voted Do Pass (H) |
2025-03-12 | Public Hearing Completed (H) |
2025-02-25 | Referred: Commerce(H) |
2025-02-11 | Read Second Time (H) |
2025-02-10 | Introduced and Read First Time (H) |
Why Relevant: The bill requires platforms to 'develop strategies to mitigate risks related to the monetization of restricted material involving minors' and to 'use systems to identify restricted material.' This echoes risk assessment and mitigation planning duties.
Mechanism of Influence: Platforms must proactively create and implement policies and technical systems to detect and address risky or unlawful content involving minors, which is a form of ongoing risk assessment and mitigation.
Evidence:
Ambiguity Notes: The bill does not specify the exact nature of the 'systems' or 'strategies' required, nor does it detail technical standards, error rates, or mandate specific technologies for detection.
Why Relevant: There is a requirement for platforms to use systems to identify restricted material, which may involve content scanning or detection.
Mechanism of Influence: Platforms could be compelled to use automated or manual review systems to detect restricted material involving minors, potentially implicating privacy if applied broadly.
Evidence:
Ambiguity Notes: No explicit mention of mandatory scanning of private communications, hashes, or 'state of the art' technology. The scope appears limited to monetized and public content.
Why Relevant: The bill allows minors to request removal or editing of content featuring them, and platforms must notify creators and comply within set timeframes. This is a takedown/removal mechanism.
Mechanism of Influence: Mandates removal of content upon request by minors, with a 72-hour window for compliance, and platform notification duties.
Evidence:
Ambiguity Notes: No provision for blocking orders or content filtering at the network level; applies specifically to content featuring minors upon their request.
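A sketch of the bookkeeping the 72-hour window requires: each removal request carries a deadline derived from its receipt time, and compliance is judged against it. Only the 72-hour figure comes from the bill; the data shape is assumed.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

TAKEDOWN_WINDOW = timedelta(hours=72)  # compliance window from the bill

@dataclass
class RemovalRequest:
    content_id: str
    received_at: datetime
    removed_at: datetime | None = None

    @property
    def deadline(self) -> datetime:
        return self.received_at + TAKEDOWN_WINDOW

    def compliant(self, now: datetime) -> bool:
        # Compliant if removed inside the window, or the window is still open.
        if self.removed_at is not None:
            return self.removed_at <= self.deadline
        return now <= self.deadline
```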
Legislation ID: 104347
Bill URL: View Bill
House Bill No. 236 amends chapter 537 of the Revised Statutes of Missouri by adding a new section that holds commercial entities liable if they fail to implement reasonable age verification for minors accessing harmful material online. It defines various terms related to the distribution of such material and outlines the liabilities and exceptions for news organizations and internet service providers.
Date | Action |
---|---|
2025-05-12 | Dropped from Calendar - Pursuant to House Rules (H) |
2025-04-24 | Placed on the Informal Third Reading Calendar (H) |
2025-04-22 | Placed Back on Third Reading Calendar (H) |
2025-04-02 | Placed on the Informal Third Reading Calendar (H) |
2025-03-27 | Placed Back on Third Reading Calendar (H) |
2025-03-05 | Placed on the Informal Third Reading Calendar (H) |
2025-02-26 | Title of Bill - Agreed To |
2025-02-26 | HCS Adopted (H) |
Why Relevant: The bill imposes a duty on certain online content providers to verify the age of users before granting access to material harmful to minors.
Mechanism of Influence: It directly mandates age verification for access to specified online content, requiring commercial entities to implement technological or procedural checks to determine user age.
Evidence:
Ambiguity Notes: 'Reasonable age-verification methods' is not further defined, leaving open what specific technologies or processes would be required. No explicit mention of biometric or government ID checks, but could be interpreted broadly.
Legislation ID: 105762
Bill URL: View Bill
This bill amends chapter 407 of the Revised Statutes of Missouri by adding a new section that imposes requirements on commercial entities that publish or distribute material harmful to minors. It defines key terms related to the distribution of such material and mandates that entities implement reasonable age-verification methods to ensure that only individuals aged eighteen or older can access this content. Furthermore, it outlines the liabilities for entities that fail to comply with these regulations and specifies exceptions for news-gathering organizations.
Date | Action |
---|---|
2025-05-15 | Referred: Emerging Issues(H) |
2025-01-09 | Read Second Time (H) |
2025-01-08 | Read First Time (H) |
2024-12-27 | Prefiled (H) |
Why Relevant: The bill mandates 'reasonable age-verification methods' for access to certain online content.
Mechanism of Influence: Commercial entities must verify user age before granting access to material harmful to minors, directly imposing an age-gating requirement.
Evidence:
Ambiguity Notes: The definition of 'reasonable age-verification methods' is not specified in detail, leaving implementation open to interpretation.
Why Relevant: The bill prohibits retention of identifying information after access is granted.
Mechanism of Influence: This provision is privacy-protective and limits data retention by entities performing age checks.
Evidence:
Ambiguity Notes: It is not clear what constitutes 'identifying information' or how compliance will be monitored.
Legislation ID: 214247
Bill URL: View Bill
This bill introduces regulations concerning the creation and monetization of content featuring minors on social media. It establishes definitions of key terms, outlines the responsibilities of content creators, and mandates the establishment of trust accounts for minors' earnings. Additionally, it provides minors with the right to request the deletion or modification of content featuring them and imposes restrictions on the use of restricted materials involving minors.
Date | Action |
---|---|
2025-03-27 | Second Read and Referred S General Laws Committee |
2025-02-25 | S First Read |
Why Relevant: The bill requires social media platforms to 'develop strategies to mitigate risks related to the monetization of restricted materials involving minors' and to 'document and reassess strategies annually.' This echoes risk assessment and mitigation plan requirements seen in the EU Chat Act.
Mechanism of Influence: Platforms must implement and annually update policies to address risks, which could include changes to moderation, content review, or monetization practices.
Evidence:
Ambiguity Notes: The bill does not specify what constitutes sufficient mitigation or the scope of risks to be assessed, leaving implementation details open to interpretation.
Why Relevant: The bill imposes a duty on platforms to facilitate and notify content removal requests by minors and on creators to comply, somewhat analogous to takedown/removal orders, though only for content featuring minors themselves.
Mechanism of Influence: Platforms and creators must act on removal requests within specific timelines, requiring internal processes and compliance infrastructure.
Evidence:
Ambiguity Notes: This is narrower than general CSAM or solicitation takedowns, as it is limited to requests by featured minors, not general detection or reporting.
Legislation ID: 37845
Bill URL: View Bill
House Bill 408 introduces measures to limit minors' access to obscene content via the internet by mandating filters on devices activated in Montana. The bill outlines the responsibilities of manufacturers and individuals regarding these filters, including penalties for violations and the ability for parents or guardians to take legal action against manufacturers or individuals who disable these filters.
Date | Action |
---|---|
2025-05-22 | (H) Died in Process |
2025-04-25 | (S) Scheduled for 2nd Reading |
2025-04-25 | (S) 2nd Reading Indefinitely Postponed |
2025-04-25 | (S) 2nd Reading Pass Motion Failed |
2025-04-24 | (S) Taken from Committee; Placed on 2nd Reading |
2025-04-23 | (S) Motion Failed |
2025-04-08 | (S) Tabled in Committee |
2025-04-03 | (S) Hearing |
Why Relevant: The bill mandates that devices must have a filter to block obscene content for minors and that the filter must be enabled by default.
Mechanism of Influence: This is a form of compelled access control, requiring manufacturers to implement technical measures to restrict content for minors.
Evidence:
Ambiguity Notes: The bill does not specify the technical standards or scope of the filter beyond 'obscene content,' nor does it mention scanning for CSAM specifically.
Why Relevant: Users must provide their age during activation, and the filter is enabled for minors.
Mechanism of Influence: This is an age-assessment requirement at device activation.
Evidence:
Ambiguity Notes: It is not clear if robust age verification is required, or if self-declaration suffices.
Why Relevant: Manufacturers are held liable if their devices do not enable the required filter and a minor accesses obscene content.
Mechanism of Influence: Creates a duty for manufacturers to ensure compliance and exposes them to civil liability.
Evidence:
Ambiguity Notes: Liability is tied to actual access of obscene content, not just failure to enable the filter.
Why Relevant: The bill allows parents/guardians to sue manufacturers and individuals for disabling filters.
Mechanism of Influence: Provides a private right of action for enforcement.
Evidence:
Ambiguity Notes: None
Why Relevant: The attorney general is authorized to take action and seek penalties and injunctions.
Mechanism of Influence: Enables government enforcement of the filtering requirement.
Evidence:
Ambiguity Notes: None
Legislation ID: 170692
Bill URL: View Bill
The Social Media Youth Protection Act aims to safeguard the well-being and privacy of minors in Montana by requiring social media companies to adopt various measures. These include implementing an age-assurance system to verify the age of account holders, providing supervisory tools for minors' accounts, and requiring parental consent for any changes to data privacy settings. The Act also outlines specific requirements for the protection of minors' personal information and establishes penalties for non-compliance.
Date | Action |
---|---|
2025-05-22 | (H) Died in Standing Committee |
2025-04-24 | (S) Tabled in Committee |
2025-04-07 | (S) Hearing |
2025-04-07 | (S) First Reading |
2025-04-07 | (S) Hearing Canceled |
2025-04-07 | (S) Referred to Committee |
2025-04-05 | (H) Transmitted to Senate |
2025-04-05 | (H) Scheduled for 3rd Reading |
Why Relevant: The bill mandates 'age-assurance systems' to verify if account holders are minors, with a specified 95% accuracy rate.
Mechanism of Influence: Requires platforms to reliably identify minor users, which is a core element of the EU Chat Act's approach to risk mitigation and age gating.
Evidence:
Ambiguity Notes: The term 'age-assurance system' is not defined in technical detail; does not specify biometric or document-based verification, nor does it require age assessment for all users (e.g., adults).
Why Relevant: Requires parental consent for changes to privacy settings and presumes confidentiality of minor information.
Mechanism of Influence: Imposes privacy-protective defaults for minors and restricts data use without parental consent, similar to some child-protection provisions in the EU Chat Act.
Evidence:
Ambiguity Notes: Does not require proactive content scanning or reporting; focuses on privacy rather than detection.
Why Relevant: Mandates supervisory tools for designated individuals to manage minors' social media usage, including time limits and usage data visibility.
Mechanism of Influence: Requires platforms to build and maintain infrastructure for parental oversight, echoing risk mitigation and supervisory tool requirements in some EU proposals.
Evidence:
Ambiguity Notes: Does not specify technical requirements for these tools or whether they must monitor content or communications.
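A sketch of the time-limit portion of such supervisory tools. The minute-granularity API is an assumption; the bill requires only that time limits and usage data be available to the designated supervisor.

```python
from datetime import timedelta

def remaining_minutes(usage_today: timedelta, daily_limit: timedelta) -> int:
    """Expose remaining screen time under a guardian-set daily limit."""
    remaining = daily_limit - usage_today
    return max(0, int(remaining.total_seconds() // 60))

def session_permitted(usage_today: timedelta, daily_limit: timedelta) -> bool:
    return remaining_minutes(usage_today, daily_limit) > 0
```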
Why Relevant: Mandates privacy and security measures for minor accounts, including the right to request deletion of personal information.
Mechanism of Influence: Creates additional compliance infrastructure and user rights for minors, paralleling some child protection and data control provisions.
Evidence:
Ambiguity Notes: No requirement for proactive detection, risk assessment, or reporting obligations.
SB 488 amends several sections of the Montana Code Annotated to include provisions against false, misleading, or deceptive consumer reviews and testimonials as unfair methods of competition. It establishes a statute of limitations for the Department of Justice to initiate actions against such practices and clarifies the timeframes for individuals to file complaints. The bill also includes provisions for the enforcement of age verification for materials harmful to minors and sets guidelines for damages and attorney fees related to violations of consumer protection laws.
Date | Action |
---|---|
2025-04-18 | Chapter Number Assigned |
2025-04-17 | (S) Signed by Governor |
2025-04-08 | (H) Signed by Speaker |
2025-04-08 | (S) Transmitted to Governor |
2025-04-07 | (S) Signed by President |
2025-04-02 | (S) Returned from Enrolling |
2025-04-01 | (H) Scheduled for 3rd Reading |
2025-04-01 | (S) Sent to Enrolling |
Why Relevant: Contains a section requiring commercial entities to implement age verification for access to material harmful to minors online.
Mechanism of Influence: Imposes a statutory duty on commercial websites to use 'reasonable age verification methods' before allowing access to certain material, with penalties for violations.
Evidence:
Ambiguity Notes: The specific technologies or standards for 'reasonable age verification' are not detailed, and the definition of 'material harmful to minors' may depend on existing state law, which could be broad or narrow.
Legislation ID: 121780
Bill URL: View Bill
LB504A appropriates specific amounts from the General Fund to the Attorney General's office for the fiscal years 2025-26 and 2026-27. The bill ensures that funds are allocated to aid in the execution of the provisions outlined in Legislative Bill 504, with explicit limits on salary expenditures for both fiscal years.
Date | Action |
---|---|
2025-06-02 | Approved by Governor on May 30, 2025 |
2025-05-28 | President/Speaker signed |
2025-05-28 | Bosn FA235 withdrawn
2025-05-28 | Returned to Select File for specific amendment |
2025-05-28 | Passed on Final Reading 42-7-0 |
2025-05-28 | Bosn FA333 filed
2025-05-28 | Bosn FA333 adopted
2025-05-28 | Bill stands indefinitely postponed |
Why Relevant: The bill mandates designation of compliance officers and reporting mechanisms, which are compliance infrastructure elements.
Mechanism of Influence: Covered online services must appoint compliance officers and set up reporting tools for harms or violations.
Evidence:
Ambiguity Notes: Does not specify the scope or detail of compliance infrastructure beyond designation and reporting.
Why Relevant: The Act requires parental controls and user tools for privacy and interaction management, which are mitigation features aimed at protecting minors.
Mechanism of Influence: Online services must provide tools for parents and minors to control privacy, account settings, and communications.
Evidence:
Ambiguity Notes: No explicit requirement for risk assessments or ongoing risk mitigation plans.
Why Relevant: The bill restricts personal data collection and use, and mandates deletion after age verification, which touches on privacy and data minimization.
Mechanism of Influence: Services must delete personal data after age verification and only collect the minimum necessary data.
Evidence:
Ambiguity Notes: No detailed provisions on age verification technology or its mandatory use for all users.
Legislation ID: 201066
Bill URL: View Bill
Assembly Bill 294 mandates that operators of online platforms, websites, or services that primarily offer material harmful to minors must implement an age verification system to prevent minors from accessing such content. The bill outlines the requirements for the age verification process, civil penalties for violations, and allows parents or guardians to take legal action against violators. It aims to safeguard minors from inappropriate material while establishing clear guidelines for online service providers.
Date | Action |
---|---|
2025-04-12 | (Pursuant to Joint Standing Rule No. 14.3.1, no further action allowed.) |
2025-02-26 | From printer. To committee. |
2025-02-25 | Read first time. Referred to Committee on Commerce and Labor. To printer. |
Why Relevant: The bill includes a clear mandate for age verification systems for access to certain online content, which is a core element of the EU Chat Act's approach to child protection online.
Mechanism of Influence: Operators must verify users' ages using government ID, transactional data, or third-party services before granting access to harmful material, directly affecting user privacy and access.
Evidence:
Ambiguity Notes: The scope is limited to platforms 'primarily offering material harmful to minors,' which may limit the breadth compared to the EU Chat Act. The bill does not specify technical standards for verification or address broader platform risk assessments or content moderation duties.
Legislation ID: 113555
Bill URL: View Bill
This bill mandates that manufacturers of electronic tablets and smartphones install filters that restrict minors from accessing obscene content online. It establishes civil and criminal liabilities for manufacturers and individuals who intentionally disable these filters to allow minors access to such material. The bill also provides a private right of action for parents or guardians if a minor accesses obscene material due to a violation of these provisions.
Date | Action |
---|---|
2025-03-03 | Executive Session: 03/03/2025 10:00 am LOB 206-208 |
2025-03-03 | Retained in Committee |
2025-02-05 | Public Hearing: 02/05/2025 11:00 am LOB 206-208 |
2025-01-08 | Introduced 01/08/2025 and referred to Judiciary; HJ 2, P. 13
Why Relevant: The bill mandates device-level filters to restrict minors' access to obscene material, with automatic enabling for users identified as minors. This is a form of age gating and content restriction.
Mechanism of Influence: Requires age input at device activation and filters to be automatically enabled for minors. Non-minor users can deactivate filters with a password. Manufacturers must ensure compliance or face liability.
Evidence:
Ambiguity Notes: The bill does not specify the technical means of age verification beyond self-reported age at activation, nor does it require ongoing risk assessment, reporting, or scanning for specific content beyond the general filter.
Why Relevant: The bill imposes liability on manufacturers for failing to install or enable required filters, and on individuals who disable filters for minors.
Mechanism of Influence: Legal and financial consequences for noncompliance create incentives for manufacturers to implement and maintain filtering technology.
Evidence:
Ambiguity Notes: No mention of risk assessments, mitigation plans, or compliance infrastructure beyond the filter itself.
Why Relevant: The bill requires age input at device activation to determine whether to enable filters, which is a form of age gating/assessment.
Mechanism of Influence: Age is self-reported by the user at activation, triggering automatic filter activation for minors.
Evidence:
Ambiguity Notes: No requirement for robust or verifiable age verification; relies on user input.
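Taken together, the three entries above describe a simple activation-time state machine: self-reported age sets the filter default, and only a password holder (a non-minor user, under the bill's scheme) can turn it off. A minimal sketch of that flow—class and field names are assumptions, not bill text:

```python
from dataclasses import dataclass, field

@dataclass
class DeviceFilter:
    enabled: bool = False
    _password: str | None = field(default=None, repr=False)

    def activate_device(self, reported_age: int, password: str) -> None:
        """Activation flow: the self-reported age decides the filter default."""
        self._password = password
        self.enabled = reported_age < 18   # auto-enable for minors

    def deactivate(self, password: str) -> bool:
        """Only a holder of the device password may switch the filter off."""
        if self._password is not None and password == self._password:
            self.enabled = False
            return True
        return False

d = DeviceFilter()
d.activate_device(reported_age=15, password="parent-set-secret")
print(d.enabled)              # True: auto-enabled because the reported age is under 18
print(d.deactivate("wrong"))  # False: the filter stays on without the password
```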
Legislation ID: 49633
Bill URL: View Bill
This bill prohibits social media platforms from promoting behaviors that could lead to eating disorders among users under 18 years old. It defines key terms related to social media and eating disorders and establishes requirements for platforms to audit their practices. Violations can result in substantial civil penalties.
Date | Action |
---|---|
2025-02-25 | Introduced in the Senate, Referred to Senate Health, Human Services and Senior Citizens Committee |
Why Relevant: The bill requires platforms to conduct both annual independent audits and quarterly internal audits of their practices as they relate to eating disorders and child users.
Mechanism of Influence: Mandates ongoing risk assessment and auditing of platform features and content that could contribute to eating disorders in minors, and requires corrective action if harmful practices are identified.
Evidence:
Ambiguity Notes: The audit requirement is specific to eating disorder risks, not general online harms or CSAM. No explicit mention of risk mitigation plans beyond 'corrective action.'
Why Relevant: Mandates corrective action within 30 days of identifying harmful practices related to eating disorders.
Mechanism of Influence: Platforms must change features or moderation practices if audits or internal reviews find risks to children related to eating disorders.
Evidence:
Ambiguity Notes: Corrective action is limited to eating disorder-related risks, not broader online harms. No explicit detail on what mitigation must entail.
This bill establishes the Digital Age Verification Act, which mandates that covered manufacturers take reasonable steps to determine the age of users on their devices and applications. It outlines specific requirements for age verification processes and parental consent for minors, ensuring that digital environments are safer for younger users. The act also provides enforcement mechanisms through existing trade practice laws.
Date | Action |
---|---|
2025-02-05 | Sent to HCPAC - Referrals: HCPAC/HJC |
Why Relevant: The bill directly mandates age verification and parental consent mechanisms for devices and apps, including application stores, which is a core element of the EU Chat Act approach.
Mechanism of Influence: Manufacturers and app stores are required to reliably determine or estimate user age and signal this information digitally, and to obtain parental consent for minors. This is likely to require robust age assessment infrastructure and could have implications for user privacy.
Evidence:
Ambiguity Notes: The bill does not specify the technical means for age estimation or verification, nor does it address ongoing risk assessments, detection/scanning, or data retention. The requirement for a 'digital signal indicating the user's age range' is somewhat open to interpretation regarding implementation.
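One plausible reading of the "digital signal indicating the user's age range" is a coarse bucket rather than an exact birth date—the less privacy-invasive design. A sketch under that assumption; the bucket boundaries are invented for illustration:

```python
from enum import Enum

class AgeRange(Enum):
    UNDER_13 = "under_13"
    AGE_13_15 = "13_15"
    AGE_16_17 = "16_17"
    ADULT = "18_plus"

def age_range_signal(estimated_age: int) -> AgeRange:
    """Map an age estimate to the coarse bucket a device might broadcast to apps.
    Only the bucket is disclosed, never the underlying birth date."""
    if estimated_age < 13:
        return AgeRange.UNDER_13
    if estimated_age < 16:
        return AgeRange.AGE_13_15
    if estimated_age < 18:
        return AgeRange.AGE_16_17
    return AgeRange.ADULT

print(age_range_signal(14).value)  # 13_15
```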
Why Relevant: The Act imposes obligations on app stores—another strong signal from the Chat Act model.
Mechanism of Influence: App stores are required to implement parental consent mechanisms and integrate with device-level age signaling, which could restrict access to certain apps based on user age.
Evidence:
Ambiguity Notes: The scope of app store duties is limited to age gating and parental consent, without reference to broader risk assessments or content moderation duties.
This bill enacts the Protection of Minors from Distribution of Harmful Material Act, which defines harmful material and outlines the responsibilities of commercial entities in preventing minors from accessing such content. It establishes liability for damages and provides a private right of action for individuals aggrieved by violations of the act.
Date | Action |
---|---|
2025-01-22 | Sent to HCEDC - Referrals: HCEDC/HJC |
Why Relevant: The bill creates a binding duty for commercial entities to implement age verification before granting access to certain online content for minors.
Mechanism of Influence: Commercial entities must use approved methods to reliably verify user age before permitting access to material harmful to minors. Failure results in liability and potential lawsuits.
Evidence:
Ambiguity Notes: 'Reasonable age verification' is defined, and methods are listed, but the scope of 'commercial entity' and 'harmful material' may be subject to interpretation.
Legislation ID: 59276
Bill URL: View Bill
This legislation introduces new regulations under the general business law, specifically requiring commercial entities that publish or distribute sexual material harmful to minors to implement reasonable age verification methods. It outlines definitions, procedures for verification, and penalties for noncompliance, while providing exceptions for bona fide news organizations and internet service providers.
Date | Action |
---|---|
2025-05-19 | held for consideration in consumer affairs and protection |
2025-01-30 | referred to consumer affairs and protection |
Why Relevant: The bill imposes a duty on covered commercial entities to verify the age of users before granting access to material deemed harmful to minors.
Mechanism of Influence: Entities must implement technical measures (digital ID, credit card, or verified accounts) to reliably determine user age, thereby restricting minors’ access to certain online content.
Evidence:
Ambiguity Notes: 'Reasonable age verification methods' are described with some specificity (digital ID, credit card, password-protected accounts), but there is no mandate for ongoing risk assessments, scanning, or broader platform duties beyond age gating.
Legislation ID: 61589
Bill URL: View Bill
This bill introduces new regulations under the general business law to safeguard children under eighteen years of age who use online products. It establishes definitions relevant to child users and mandates that online products targeted toward children incorporate features that allow parents to limit screen time and block harmful content. Additionally, it grants the attorney general the authority to enforce compliance and impose penalties for violations.
Date | Action |
---|---|
2025-02-12 | referred to consumer affairs and protection |
Why Relevant: The bill imposes platform duties on online products aimed at children, requiring them to implement screen time limitation features and restrict certain engagement mechanisms.
Mechanism of Influence: Providers must add technical parental controls and may be compelled to remove or alter engagement features if the bureau deems them inappropriate for children. The bill also prohibits promoting harmful/illegal activities to children.
Evidence:
Ambiguity Notes: 'Engagement features' and 'harmful or illegal activities' are not precisely defined, giving the bureau broad discretion. No explicit risk assessment or mitigation planning is required.
Why Relevant: The enforcement mechanism places compliance duties on providers, with penalties for non-compliance and a cure period after notice from the attorney general.
Mechanism of Influence: Providers must monitor compliance and respond to enforcement actions, potentially changing product features or policies to avoid penalties.
Evidence:
Ambiguity Notes: No mention of specific technical compliance infrastructure, reporting, or audit requirements.
Why Relevant: The bill defines 'child,' 'child user,' 'online product,' and what it means for a product to be targeted towards children, clarifying scope.
Mechanism of Influence: Defines which products and users fall under the law's requirements.
Evidence:
Ambiguity Notes: Definitions may be broad, but do not impose technical or ongoing risk assessment duties.
Legislation ID: 64455
Bill URL: View Bill
This bill amends the general business law to introduce the New York Children's Online Safety Act, which includes provisions for privacy by default, parental approvals, and protections against manipulative design practices (dark patterns) on online platforms used by minors. It mandates age verification for users, default privacy settings for minors, and parental oversight for connections and financial transactions involving minors. The bill also outlines the authority of the attorney general to enforce these regulations and provides remedies for violations.
Date | Action |
---|---|
2025-05-27 | reported referred to codes |
2025-03-06 | referred to consumer affairs and protection |
Why Relevant: The bill mandates 'commercially reasonable age verification' for users of covered platforms and requires default privacy settings for minors, with parental controls.
Mechanism of Influence: Operators must verify user age and set privacy defaults preventing unconnected users from contacting minors. Parental approval is required for connections and transactions involving users under 13.
Evidence:
Ambiguity Notes: The bill does not specify technical standards for age verification, nor does it require the use of specific technologies (e.g., facial recognition, document uploads).
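The default-privacy duty reduces to a default-deny rule on contact with minors. A minimal sketch, assuming the platform keeps a per-account set of approved connections (the data structures are stand-ins, not bill language):

```python
from dataclasses import dataclass

@dataclass
class User:
    user_id: str
    is_minor: bool

# Approved connections per account; a stand-in for the platform's social graph.
connections: dict[str, set[str]] = {"kid42": {"parent_account", "classmate_9"}}

def may_contact(sender_id: str, recipient: User) -> bool:
    """Privacy-by-default: an unconnected account cannot reach a minor."""
    if recipient.is_minor:
        return sender_id in connections.get(recipient.user_id, set())
    return True  # non-minor accounts keep the platform's ordinary defaults

print(may_contact("stranger", User("kid42", is_minor=True)))        # False
print(may_contact("parent_account", User("kid42", is_minor=True)))  # True
```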
Why Relevant: The bill prohibits 'dark patterns,' which are manipulative designs that could undermine user autonomy or privacy.
Mechanism of Influence: Operators must not use interface designs that mislead or inhibit minors from exercising rights under the Act.
Evidence:
Ambiguity Notes: The term 'dark patterns' is broad and could be subject to interpretation in enforcement.
Why Relevant: The bill assigns enforcement and rulemaking authority to the Attorney General.
Mechanism of Influence: The Attorney General can create rules, investigate violations, and seek penalties or restitution.
Evidence:
Ambiguity Notes: The scope of rules could expand obligations depending on future regulations.
Legislation ID: 166829
Bill URL: View Bill
The New York Artificial Intelligence Act aims to regulate AI systems that significantly impact individuals' rights and opportunities. It addresses algorithmic discrimination, mandates developer and deployer responsibilities, and introduces auditing and reporting requirements for high-risk AI systems. The Act emphasizes transparency, oversight, and the protection of vulnerable populations from potential harms associated with AI technologies.
Date | Action |
---|---|
2025-06-11 | reference changed to ways and means |
2025-06-09 | referred to science and technology |
Why Relevant: The Act imposes risk assessment and mitigation requirements on developers and deployers of high-risk AI systems, requiring audits and risk prevention.
Mechanism of Influence: Developers and deployers must 'conduct independent audits' and 'prevent foreseeable risks' associated with high-risk AI, which functionally requires ongoing risk assessment and mitigation planning.
Evidence:
Ambiguity Notes: The terms 'independent audits' and 'prevent foreseeable risks' are broad and could be interpreted to include a range of compliance activities, but the law does not specify the exact mechanisms or frequency of assessments.
Legislation ID: 69375
Bill URL: View Bill
This legislation proposes to amend the general business law by introducing a new section that mandates commercial entities operating internet pornography websites to implement age verification methods. The goal is to ensure that individuals attempting to access such material are at least eighteen years old, thereby protecting minors from exposure to sexual content deemed harmful. The bill outlines definitions, requirements for age verification, penalties for non-compliance, and exceptions for news organizations.
Date | Action |
---|---|
2025-05-05 | DEFEATED IN INTERNET AND TECHNOLOGY |
2025-05-05 | REPORTED AND COMMITTED TO CODES |
2025-03-21 | NOTICE OF COMMITTEE CONSIDERATION - REQUESTED |
2025-01-28 | REFERRED TO INTERNET AND TECHNOLOGY |
Why Relevant: The bill creates a binding duty for commercial pornography sites to verify the age of users before granting access to adult material.
Mechanism of Influence: Mandates 'age verification' (via digital ID, credit card, or password-protected account) and IP address checks as a precondition for access. This is a core feature of the EU Chat Act and similar online child-protection regimes.
Evidence:
Ambiguity Notes: The bill specifies verification methods but does not require risk assessments, mitigation plans, or provider scanning for CSAM. Applicability is limited to commercial pornography sites.
Legislation ID: 70393
Bill URL: View Bill
This bill amends the general business law to introduce the New York Children's Online Safety Act, which aims to provide a safer online environment for minors. It includes provisions for age verification, privacy settings, parental consent for connections and financial transactions, and prohibits manipulative design practices. The bill also grants the Attorney General authority to enforce these regulations and outlines remedies for violations.
Date | Action |
---|---|
2025-02-10 | REFERRED TO INTERNET AND TECHNOLOGY |
Why Relevant: The bill explicitly requires age verification ('Operators must verify user age using commercially reasonable methods.') and mandates privacy-by-default settings for minors.
Mechanism of Influence: Operators must implement technical means to assess the age of users and enforce privacy restrictions, directly affecting platform operations and user access.
Evidence:
Ambiguity Notes: The term 'commercially reasonable methods' for age verification is broad and could range from self-declaration to more intrusive identity checks.
Why Relevant: Parental approval is mandated for connections and financial transactions for minors under 13.
Mechanism of Influence: Operators must build and maintain mechanisms for parental consent and monitoring, altering how children interact online.
Evidence:
Ambiguity Notes: The scope of 'connections' and 'transactions' may be interpreted variably by platforms.
Why Relevant: The act prohibits manipulative design practices ('dark patterns') that inhibit user autonomy.
Mechanism of Influence: Operators are required to review and possibly redesign interfaces to ensure compliance.
Evidence:
Ambiguity Notes: What constitutes a 'dark pattern' may be subject to interpretation and rulemaking.
Why Relevant: The Attorney General has explicit rulemaking and enforcement authority.
Mechanism of Influence: Centralizes compliance oversight and enables enforcement through civil penalties and injunctions.
Evidence:
Ambiguity Notes: The extent of regulatory detail and enforcement discretion is left to future rulemaking.
Legislation ID: 162427
Bill URL: View Bill
House Bill 301 introduces the "Social Media Protection for Minors Act," which mandates that social media platforms implement age verification methods and restrict account creation for minors under 16. The bill outlines specific requirements for platforms regarding account management for minors, including parental consent for those aged 14 and 15, and establishes penalties for non-compliance.
Date | Action |
---|---|
2025-05-07 | Passed 1st Reading |
2025-05-07 | Ref To Com On Rules and Operations of the Senate |
2025-05-07 | Regular Message Sent To Senate |
2025-05-07 | Regular Message Received From House |
2025-05-06 | Passed 3rd Reading |
2025-05-06 | Passed 2nd Reading |
2025-05-05 | Cal Pursuant Rule 36(b) |
2025-05-05 | Placed On Cal For 05/06/2025 |
Why Relevant: The bill mandates 'age verification methods' for social media platforms to ensure users are 16 or older, and requires parental consent for ages 14 and 15.
Mechanism of Influence: Platforms must implement technical means to verify user age, potentially affecting onboarding processes and user privacy.
Evidence:
Ambiguity Notes: The bill allows both 'anonymous and standard' verification, but does not specify the exact technologies or verification standards, which could range from self-attestation to third-party checks.
Why Relevant: The bill prohibits account creation for minors under 14 and requires parental consent for minors aged 14 or 15.
Mechanism of Influence: Platforms must design and enforce account management and consent workflows, and terminate accounts of underage users, which may require collecting and processing parental information.
Evidence:
Ambiguity Notes: Does not detail how parental consent is to be obtained or verified, nor how platforms must authenticate ages beyond initial verification.
Why Relevant: Mandates deletion of personal information upon account termination for underage users.
Mechanism of Influence: Platforms must implement data deletion processes tied to age and account status.
Evidence:
Ambiguity Notes: Does not specify retention periods or technical standards for deletion.
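The deletion-on-termination duty is sketched minimally below. The datastore and the hard-delete behavior are assumptions, since the bill specifies neither retention periods nor technical standards for deletion:

```python
def terminate_underage_account(store: dict[str, dict], user_id: str) -> None:
    """On terminating an under-age account, erase the personal data with it.
    `store` stands in for whatever datastore a platform actually uses."""
    profile = store.get(user_id)
    if profile is not None and profile.get("age", 99) < 14:
        store.pop(user_id)   # hard-delete rather than merely deactivating

accounts = {"u1": {"age": 12, "email": "kid@example.com"}}
terminate_underage_account(accounts, "u1")
print(accounts)  # {} -- the record is gone, not just flagged
```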
This bill amends Article 10 of Chapter 143B of the General Statutes to create the North Carolina Artificial Intelligence Innovation Trust Fund. The fund will provide financial assistance for developing AI models and entrepreneurship programs, while also imposing restrictions on the use of AI for harmful purposes. It defines key terms related to AI safety and establishes requirements for developers of covered models to ensure compliance with safety protocols and prevent critical harms.
Date | Action |
---|---|
2025-03-26 | Ref To Com On Rules and Operations of the Senate |
2025-03-26 | Passed 1st Reading |
2025-03-26 | Withdrawn From Com |
2025-03-26 | Re-ref Com On Appropriations/Base Budget |
2025-03-25 | Filed |
Why Relevant: The bill imposes annual compliance reevaluation and mandates independent third-party assessments for developers of covered AI models.
Mechanism of Influence: Requires ongoing risk assessments and compliance reviews, similar to the EU Chat Act's risk assessment and mitigation obligations.
Evidence:
Ambiguity Notes: The compliance procedures are broadly defined and may cover a range of risks, but there is no explicit reference to child sexual abuse prevention or online platform-specific risks.
Why Relevant: Developers must report AI safety incidents within 72 hours and submit annual compliance statements to the Attorney General.
Mechanism of Influence: This creates mandatory reporting obligations, analogous to reporting requirements in the EU Chat Act, though not specific to CSAM.
Evidence:
Ambiguity Notes: The term 'AI safety incident' is not defined in relation to child safety or abuse material.
Why Relevant: Developers are required to maintain records of disclosures for seven years and provide updates to whistleblowers.
Mechanism of Influence: Establishes data preservation and transparency obligations, similar to compliance infrastructure requirements in the EU Chat Act.
Evidence:
Ambiguity Notes: Record-keeping is focused on internal disclosures and investigations, not user content or communications.
Why Relevant: Requires developers to evaluate risks before public or commercial release of covered models.
Mechanism of Influence: Mandates pre-release risk assessment, but not specifically for child abuse or online harms.
Evidence:
Ambiguity Notes: Scope of 'critical harm' is defined by the bill but not limited to child safety.
Legislation ID: 162997
Bill URL: View Bill
House Bill 805 establishes definitions for biological sex, limits state funding for gender transition procedures, and aims to prevent sexual exploitation through strict regulations on online pornography. The bill mandates that all state laws reflect a binary understanding of sex and introduces measures to protect minors and women from exploitation, including consent requirements for publishing pornographic images. Additionally, it allows students to be excused from discussions that conflict with their religious beliefs and ensures parental oversight of educational materials.
Date | Action |
---|---|
2025-07-29 | Veto Overridden |
2025-07-29 | Veto Received From House |
2025-07-29 | Placed on Today's Calendar |
2025-07-03 | Received from the Governor |
2025-07-03 | Placed On Cal For 07/29/2025 |
2025-07-03 | Vetoed 07/03/2025 |
2025-06-27 | Pres. To Gov. 6/27/2025 |
2025-06-26 | Ratified |
Why Relevant: This provision regulates online pornography platforms and introduces duties to verify age and consent before publishing images, as well as a removal process for non-consensual content. These are platform obligations related to preventing exploitation of minors online.
Mechanism of Influence: Operators of online platforms must implement age and consent verification for pornographic images, and maintain a process for removing non-consensual images. This could require identity or age verification technologies and content moderation procedures.
Evidence:
Ambiguity Notes: The bill does not specify the technical means for age verification (e.g., whether it must be robust, third-party, or document-based), nor does it mandate general scanning, risk assessments, or detection orders. The removal process is limited to non-consensual images, not broader CSAM detection or reporting.
This bill establishes the Social Media Control in Information Technology Act, which mandates that social media platforms provide clear disclosures regarding data collection and usage, particularly for minors. It requires platforms to implement user-friendly mechanisms for privacy rights, prohibits the use of minors' data in algorithmic recommendations, and sets default privacy settings to protect young users. Additionally, it holds operators accountable for non-compliance and creates a registry for privacy policies.
Date | Action |
---|---|
2025-06-17 | Re-ref Com On Appropriations |
2025-06-17 | Reptd Fav Com Substitute |
2025-04-10 | Ref to the Com on Commerce and Economic Development, if favorable, Appropriations, if favorable, Rules, Calendar, and Operations of the House |
2025-04-10 | Passed 1st Reading |
2025-04-09 | Filed |
Why Relevant: The bill creates platform duties regarding minors' privacy and algorithmic use of their data.
Mechanism of Influence: Mandates that platforms set privacy-protective defaults for minors and prohibits algorithmic recommendations using minors' data.
Evidence:
Ambiguity Notes: The bill doesn't specify risk assessment, mitigation, or detection obligations—its focus is on privacy defaults and transparency, not active scanning or risk management.
Why Relevant: Enforcement provisions create accountability for compliance with minors' privacy protections.
Mechanism of Influence: Allows the Attorney General to monitor compliance and permits private lawsuits by minors.
Evidence:
Ambiguity Notes: No language about ongoing risk assessments, technical scanning, or compelled detection orders.
Why Relevant: Establishes a task force to oversee data privacy for minors, with annual reporting.
Mechanism of Influence: Creates a body to study and recommend changes on minors' social media privacy.
Evidence:
Ambiguity Notes: No operational authority for scanning, mitigation, or technical interventions.
Why Relevant: Mandates a registry of privacy policies for platforms.
Mechanism of Influence: Platforms must file privacy policy with the state for transparency.
Evidence:
Ambiguity Notes: No direct operational or technical controls on platform architecture.
Why Relevant: Provides digital rights for minors, including protection from manipulative design and transparency.
Mechanism of Influence: Requires platforms to shield minors from certain design features and explain platform impacts.
Evidence:
Ambiguity Notes: No language about technical risk assessments, detection, or mandatory scanning.
Why Relevant: Mandates user-friendly mechanisms for privacy rights, including correction and deletion.
Mechanism of Influence: Lets users (including minors) request correction or deletion of their data.
Evidence:
Ambiguity Notes: No requirements for data retention, preservation, or reporting to authorities.
Legislation ID: 93752
Bill URL: View Bill
This bill introduces a new section to the North Dakota Century Code that outlines the responsibilities of commercial entities regarding the publication and distribution of sexual material that may be harmful to minors. It defines key terms, establishes liability for failing to verify the age of individuals accessing such material, and allows for civil actions against entities that violate these provisions. It also includes exemptions for news organizations and protections for internet service providers.
Date | Action |
---|---|
2025-03-18 | Committee Hearing 02:30 (Industry and Business) |
2025-02-13 | Introduced, first reading, referred to Industry and Business |
2025-02-10 | Received from House |
2025-02-07 | Second reading, passed, yeas 91 nays 0 |
2025-02-06 | Reported back, do pass, place on calendar 11 0 3 |
2025-02-04 | Committee Hearing 10:00 (Judiciary) |
2025-01-20 | Introduced, first reading, referred to Judiciary |
Why Relevant: The bill requires 'commercial entities' to implement 'reasonable age verification methods' to prevent minors from accessing sexual material harmful to minors.
Mechanism of Influence: Mandates age verification for access to certain online content, directly implicating user privacy and access rights.
Evidence:
Ambiguity Notes: The bill does not specify the technical means or standards for age verification, leaving room for interpretation as to what constitutes a 'reasonable' method.
Why Relevant: The bill prohibits retention of identifying information after verification.
Mechanism of Influence: This attempts to mitigate privacy risks from the age verification process.
Evidence:
Ambiguity Notes: Does not define what constitutes 'identifying information' or how compliance will be monitored.
Legislation ID: 93814
Bill URL: View Bill
This bill creates a new section in the North Dakota Century Code that holds covered platforms accountable for publishing or distributing sexual material harmful to minors. It outlines definitions, requirements for age verification, and penalties for violations, including the ability for parents or guardians to bring civil actions against platforms that fail to protect minors from accessing harmful content.
Date | Action |
---|---|
2025-02-17 | Second reading, failed to pass, yeas 36 nays 53 |
2025-02-14 | Amendment adopted, placed on calendar |
2025-02-13 | Reported back amended, do pass, amendment placed on calendar 11 3 0 |
2025-02-12 | Committee Hearing 09:00 (Judiciary) |
2025-01-20 | Introduced, first reading, referred to Judiciary |
Why Relevant: The bill imposes a binding duty on covered platforms to implement age verification for access to certain harmful content by minors.
Mechanism of Influence: Platforms must use 'reasonable age verification' (not just IP address) to block minors from sexual material deemed harmful. Failure exposes them to civil suits and substantial fines.
Evidence:
Ambiguity Notes: While 'reasonable age verification' is not exhaustively defined, the bill requires diligence and specifically excludes IP address-only checks. The scope is limited to sexual material harmful to minors, not general online child safety or CSAM.
Legislation ID: 94641
Bill URL: View Bill
This bill introduces a new section to the North Dakota Century Code, mandating that covered manufacturers and online services take necessary steps to verify the age of users to prevent minors from accessing mature content. It establishes penalties for non-compliance and outlines the responsibilities of manufacturers and service providers regarding age verification and content labeling.
Date | Action |
---|---|
2025-03-26 | Committee Hearing 09:00 (Judiciary) |
2025-02-21 | Introduced, first reading, referred to Judiciary |
2025-02-20 | Received from Senate |
2025-02-19 | Amendment adopted, placed on calendar |
2025-02-19 | Second reading, passed, yeas 46 nays 1 |
2025-02-18 | Reported back amended, do pass, amendment placed on calendar 5 0 0 |
2025-02-10 | Committee Hearing 10:00 (Industry and Business) |
2025-01-27 | Introduced, first reading, referred to Industry and Business |
Why Relevant: The bill explicitly requires 'reasonable age verification methods' for access to sexual material harmful to minors.
Mechanism of Influence: Commercial entities must implement technical means to verify user ages prior to granting access to restricted material, or face civil liability.
Evidence:
Ambiguity Notes: 'Reasonable age verification methods' is not specifically defined, leaving room for interpretation regarding what technologies or processes are sufficient. No explicit mention of biometric, government ID, or third-party verification, nor any reference to proportionality or privacy safeguards beyond the non-retention clause.
Why Relevant: The bill prohibits retention of identifying information after age verification.
Mechanism of Influence: Entities must design systems so that any personal information collected for age verification is deleted after access is granted, which could affect the architecture of verification solutions.
Evidence:
Ambiguity Notes: No detail on what constitutes 'identifying information,' nor how compliance will be verified or enforced.
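One way to satisfy a non-retention clause architecturally is to scope identifying data to the verification step itself, so that nothing survives the check regardless of outcome. A sketch of that idea using a context manager; the cutoff date and field names are illustrative assumptions:

```python
from contextlib import contextmanager

@contextmanager
def ephemeral_verification(identifying_info: dict):
    """Hold identifying information only for the duration of the check, then
    clear it whether or not verification succeeds (a non-retention guarantee)."""
    try:
        yield identifying_info
    finally:
        identifying_info.clear()   # nothing identifying survives the check

info = {"name": "Jane Doe", "dob": "1990-01-01"}
with ephemeral_verification(info) as evidence:
    # ISO dates compare correctly as strings; anyone born on or before this
    # (illustrative) cutoff is treated as an adult.
    is_adult = evidence["dob"] <= "2007-01-01"
print(info, is_adult)  # {} True
```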
Legislation ID: 129262
Bill URL: View Bill
This bill proposes the enactment of section 1349.07 of the Revised Code, which mandates that application stores provide parental control options and obtain consent from parents before allowing children under 16 to download specific applications. It outlines definitions related to application stores, broadband services, and age determination, while establishing requirements for developers and manufacturers to facilitate parental oversight and compliance with age-related restrictions.
Date | Action |
---|---|
2025-04-09 | Referred to committee |
2025-04-07 | Introduced |
Why Relevant: The bill imposes a duty on application stores to obtain parental consent and estimate user age, which directly aligns with strong age verification/assessment signals.
Mechanism of Influence: App stores must implement mechanisms to verify age and obtain parental consent before allowing users under 16 to download apps (see 'must obtain parental consent for downloads by users under 16'). Manufacturers must estimate user age at device activation or update.
Evidence:
Ambiguity Notes: The bill does not specify the technical method for age estimation or verification, leaving open the possibility of intrusive or privacy-impacting implementations.
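Operationally, the app-store duty reduces to a consent gate keyed on the age estimate. A minimal sketch—the under-16 threshold comes from the entry above; the `DownloadRequest` record and everything else are assumptions:

```python
from dataclasses import dataclass

@dataclass
class DownloadRequest:
    app_id: str
    user_age: int           # from the manufacturer's activation-time estimate
    parental_consent: bool  # recorded by the app store

def may_download(req: DownloadRequest) -> bool:
    """App-store gate as described: users under 16 need parental consent."""
    return req.user_age >= 16 or req.parental_consent

print(may_download(DownloadRequest("app.example", user_age=14, parental_consent=False)))  # False
print(may_download(DownloadRequest("app.example", user_age=14, parental_consent=True)))   # True
```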
Why Relevant: The bill requires developers to assess if their apps are likely to be accessed by children and notify app stores, which is a form of risk assessment.
Mechanism of Influence: Developers must determine and notify whether their app is likely to be accessed by children before distribution ('Developers must determine if their application is likely to be accessed by children before distribution').
Evidence:
Ambiguity Notes: The criteria for 'likely to be accessed by children' may be subject to interpretation.
Why Relevant: The bill mandates parental control features, enabling parents to manage content and usage, which is a mitigation measure.
Mechanism of Influence: Developers must implement parental controls for managing linked accounts, content delivery, and usage time.
Evidence:
Ambiguity Notes: No specifics on technical design or privacy safeguards.
Legislation ID: 216592
Bill URL: View Bill
This bill enacts several sections of the Ohio Revised Code focusing on age verification and parental consent for applications. It defines key terms related to age categories, application types, and parental controls. It mandates that application distribution providers and developers implement measures to verify the age of users and obtain parental consent for minors accessing certain content. The bill outlines specific responsibilities for application distributors, developers, operating system providers, and internet browsers to ensure compliance with these requirements.
Date | Action |
---|---|
2025-05-28 | Referred to committee |
2025-05-27 | Introduced |
Why Relevant: The bill requires application distributors, developers, and OS/browser providers to implement age verification and parental controls, which are core elements of the EU Chat Act's risk mitigation and age-gating features.
Mechanism of Influence: Providers must request age declarations, offer age verification, and enable parental controls and content filtering, directly affecting how minors can access apps and content.
Evidence:
Ambiguity Notes: The bill does not specify technical methods for age verification or content filtering, leaving room for interpretation about intrusiveness or proportionality.
This bill amends existing sections of the Revised Code and introduces new sections to ensure that organizations verify the age of individuals accessing materials deemed obscene or harmful to juveniles. It prohibits the non-consensual use of an individual's likeness to create sexual images and allows victims to pursue legal action against offenders. The bill outlines the responsibilities of organizations regarding age verification methods and the handling of personal data.
Date | Action |
---|---|
2025-02-12 | Referred to committee |
2025-02-11 | Introduced |
Why Relevant: The bill contains explicit requirements for age verification for access to certain online materials, which is one of the core elements of the EU Chat Act.
Mechanism of Influence: Organizations must implement 'reasonable age verification methods' to ensure users are 18 or older before accessing obscene or harmful materials. They must block access to those whose age cannot be verified and notify users of verification failures.
Evidence:
Ambiguity Notes: The bill does not specify technical standards for age verification (e.g., biometric, document upload, third-party service), leaving room for interpretation.
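The block-and-notify behavior described above can be modeled as a fail-closed gate: an unverifiable age blocks access and surfaces a notification rather than silently defaulting to open. A sketch, with all names invented for illustration:

```python
class VerificationFailure(Exception):
    """Raised so the caller can notify the user, as the bill requires."""

def gate_access(verified_age: int | None) -> bool:
    """Block access outright when age cannot be verified, rather than
    defaulting to open; `None` models a failed or unavailable check."""
    if verified_age is None:
        raise VerificationFailure("Age could not be verified; access is blocked.")
    return verified_age >= 18

try:
    gate_access(None)
except VerificationFailure as e:
    print(e)            # the user is told why access was denied
print(gate_access(34))  # True
```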
Why Relevant: The bill requires organizations to delete identifying information after age verification, addressing data handling and privacy in the context of age checks.
Mechanism of Influence: Organizations must delete personal data used for age verification once the process is complete.
Evidence:
Ambiguity Notes: No mandate for broader data retention or preservation tied to detection or complaints.
Why Relevant: The bill requires organizations to implement a geolocation monitoring system for users attempting to access materials.
Mechanism of Influence: This could be used to restrict or monitor access based on user location, but is not directly tied to risk assessment or mitigation.
Evidence:
Ambiguity Notes: The scope and technical requirements for geolocation monitoring are not defined.
Why Relevant: The bill establishes civil and criminal penalties for organizations and individuals that fail to verify age or disseminate non-consensual sexual images.
Mechanism of Influence: Creates legal and financial incentives for compliance with age verification and nonconsensual image rules.
Evidence:
Ambiguity Notes: Penalties are not tied to risk assessments or proactive scanning.
Legislation ID: 216511
Bill URL: View Bill
This legislation mandates that application developers and manufacturers implement specific parental control features and age verification processes for applications designed for children. It requires developers to notify application stores if their apps are likely to be accessed by minors, and it outlines the obligations of manufacturers and application stores in managing and monitoring access to these applications. The bill also provides enforcement mechanisms through the attorney general to ensure compliance.
Date | Action |
---|---|
2025-04-02 | Referred to committee |
2025-04-01 | Introduced |
Why Relevant: The bill explicitly mandates app stores to implement age verification and parental consent for users under sixteen, and to provide parental supervision tools.
Mechanism of Influence: App stores must build infrastructure to verify user ages, obtain parental consent, and enable parental oversight, similar to the EU Chat Act's age verification and app-store gatekeeping requirements.
Evidence:
Ambiguity Notes: The bill does not specify technical standards for age verification or parental controls, nor does it address how robust or intrusive these checks must be.
Why Relevant: Developers are required to assess whether their applications are likely to be accessed by children and to notify application stores accordingly.
Mechanism of Influence: This creates a risk assessment duty for developers, though limited to the likelihood of child access rather than broader risk mitigation or ongoing assessment.
Evidence:
Ambiguity Notes: No explicit ongoing risk assessment or mitigation plan required beyond the initial notification; does not mention product feature changes or recommender system adjustments.
Why Relevant: Manufacturers must estimate the age of the primary user upon device activation and during certain updates, implementing age estimation/verification at the device level.
Mechanism of Influence: This imposes an age estimation/verification infrastructure requirement on device manufacturers, which is a strong parallel to age assessment provisions in the EU Chat Act.
Evidence:
Ambiguity Notes: The method of age estimation is not described; it is unclear whether this is self-declaration, technical estimation, or third-party verification.
Why Relevant: The bill gives enforcement power to the attorney general, including civil actions and damages for violations, providing a compliance and enforcement mechanism.
Mechanism of Influence: This ensures that the statutory duties for age verification and parental controls are enforceable, similar to the compliance infrastructure elements of the EU Chat Act.
Evidence:
Ambiguity Notes: No mention of audits, transparency reporting, or legal representatives; enforcement appears limited to post-hoc actions rather than proactive compliance infrastructure.
Why Relevant: The bill requires developers to provide features for parental management of children's use of applications, including linked account management, age-appropriate content, and usage time limits.
Mechanism of Influence: This mandates the implementation of parental control features, which may influence app design and user privacy, though does not reach scanning or content moderation obligations.
Evidence:
Ambiguity Notes: No mention of compelled detection, content scanning, or moderation of interpersonal messaging; limited to access and usage controls.
Legislation ID: 129218
Bill URL: View Bill
This bill introduces several new sections in the Revised Code that define terms related to age verification and parental consent. It outlines the responsibilities of application distribution providers, developers of covered applications, operating system providers, and internet browsers or search engines in ensuring that minors are adequately protected from inappropriate content. The bill mandates the implementation of age signals and parental controls, and it provides legal protections for those who make good faith efforts to comply with these requirements.
Date | Action |
---|---|
2025-04-30 | Referred to committee |
2025-04-09 | Introduced |
Why Relevant: The bill requires application distribution providers and operating system providers to verify the age of account holders and enable technical controls for parental oversight.
Mechanism of Influence: Mandates age verification and parental controls, and requires age signals to be shared with developers to enforce age-based restrictions.
Evidence:
Ambiguity Notes: The term 'age signal' and the methods of verification are not described in detail; does not specify whether verification must be highly reliable or what technical standards apply.
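An "age signal" crossing a provider/developer boundary would need some integrity protection to be worth trusting. The sketch below uses an HMAC tag purely for illustration; a production design would more plausibly use asymmetric signatures so developers never hold the signing key, and the bill specifies neither:

```python
import hashlib
import hmac
import json

# Illustrative shared key; real deployments would use proper key management.
SECRET = b"os-provider-demo-key"

def issue_age_signal(account_id: str, age_range: str) -> dict:
    """Provider side: emit an integrity-protected age signal for developers."""
    payload = json.dumps({"account": account_id, "age_range": age_range}, sort_keys=True)
    tag = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "tag": tag}

def check_age_signal(signal: dict) -> dict | None:
    """Developer side: trust the signal only if the tag verifies."""
    expected = hmac.new(SECRET, signal["payload"].encode(), hashlib.sha256).hexdigest()
    if hmac.compare_digest(expected, signal["tag"]):
        return json.loads(signal["payload"])
    return None

sig = issue_age_signal("acct-7", "13_15")
print(check_age_signal(sig))  # {'account': 'acct-7', 'age_range': '13_15'}
```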
Why Relevant: Developers must obtain verifiable parental consent before allowing minors to access restricted content.
Mechanism of Influence: Imposes a duty on developers to gate certain content behind parental consent, creating a compliance infrastructure for age gating.
Evidence:
Ambiguity Notes: Does not define what constitutes 'verifiable parental consent' or specify technical standards.
Legislation ID: 56352
Bill URL: View Bill
House Bill 1275 seeks to implement regulations on social media platforms regarding minors' access. It prohibits account creation for minors under sixteen and requires parental consent for those aged sixteen and seventeen. The bill mandates age verification processes and restricts social media companies from certain practices that could harm minors. Violations can be reported to the Attorney General, who has the authority to enforce compliance.
Date | Action |
---|---|
2025-04-22 | Placed on General Order |
2025-04-17 | Enacting clause stricken |
2025-04-17 | Reported Do Pass as amended Technology and Telecommunications committee; CR filed |
2025-04-01 | Second Reading referred to Technology and Telecommunications |
2025-03-25 | First Reading |
2025-03-25 | Engrossed, signed, to Senate |
2025-03-24 | Amended |
2025-03-24 | Title stricken |
Why Relevant: The bill imposes age verification requirements for all account holders and restricts access for users under 16, with parental consent required for 16- and 17-year-olds.
Mechanism of Influence: Mandates that social media companies must verify the age of users, either directly or through third-party vendors, before allowing account creation.
Evidence:
Ambiguity Notes: The bill specifies that companies "must verify the age of account holders" and allows use of "third-party vendors for age verification," but does not detail technical standards or error rates.
Why Relevant: The bill restricts processing and profiling of minors' personal information and prohibits collection of unnecessary data.
Mechanism of Influence: Imposes data minimization and risk-based processing restrictions, requiring platforms to implement safeguards for any profiling and to avoid processing if it poses a risk.
Evidence:
Ambiguity Notes: The specifics of what constitutes 'unnecessary' or 'risky' data processing are not defined, leaving room for interpretation.
Legislation ID: 55886
Bill URL: View Bill
House Bill 1346 introduces significant amendments to existing laws regarding child pornography and related offenses in Oklahoma. It modifies definitions, establishes stricter penalties for offenders, and clarifies the responsibilities of commercial entities in preventing the distribution of harmful materials. The bill also emphasizes the importance of age verification methods for online content access and aims to strengthen the Oklahoma Sentencing Modernization Act.
Date | Action |
---|---|
2025-04-02 | Second Reading referred to Judiciary |
2025-03-31 | Engrossed, signed, to Senate |
2025-03-31 | First Reading |
2025-03-25 | Referred for engrossment |
2025-03-25 | Emergency added |
2025-03-25 | Amended |
2025-03-25 | General Order |
2025-03-25 | Third Reading, Measure and Emergency passed: Ayes: 91 Nays: 0 |
Why Relevant: The bill introduces requirements for 'reasonable age verification methods' for access to harmful materials online, which is a core feature of the EU Chat Act's approach to preventing child sexual abuse online.
Mechanism of Influence: Commercial entities must implement age verification to ensure users are 18+ before accessing certain online materials, potentially affecting privacy and user rights.
Evidence:
Ambiguity Notes: The bill does not specify the technical standards or enforcement mechanisms for age verification, nor does it extend these requirements to interpersonal communication services or mandate risk assessments, detection/scanning, or app-store gatekeeping.
Legislation ID: 56373
Bill URL: View Bill
House Bill 1762 establishes requirements for covered entities that offer online products and services likely to be accessed by children. It mandates the completion of data protection impact assessments, sets default privacy settings, and prohibits the processing of children's data in ways that are not aligned with their best interests. The bill also outlines penalties for violations and clarifies the confidentiality of the assessments.
Date | Action |
---|---|
2025-02-10 | Referred to Rules |
2025-02-10 | Withdrawn from Children, Youth and Family Services Committee |
2025-02-10 | Withdrawn from Health and Human Services Oversight Committee |
2025-02-04 | Second Reading referred to Health and Human Services Oversight |
2025-02-04 | Referred to Children, Youth and Family Services |
2025-02-03 | Authored by Representative Kerbs |
2025-02-03 | First Reading |
Why Relevant: The bill requires 'data protection impact assessments' for online products likely to be accessed by children, with ongoing review and submission to the Attorney General.
Mechanism of Influence: This creates a binding risk assessment and mitigation planning duty for covered entities, similar to the EU Chat Act’s risk assessment obligations.
Evidence:
Ambiguity Notes: The language focuses on risks to privacy and children's best interests, but does not explicitly require mitigation of online child sexual abuse or grooming risks.
Legislation ID: 63311
Bill URL: View Bill
The Safe Screens for Kids Act establishes regulations for social media platforms regarding use by minors. It mandates that minors obtain parental consent to create or maintain accounts, ensures parents have access to their children's accounts, and prohibits the collection of identifiable data from minors. Additionally, it restricts the use of algorithms and targeted advertisements aimed at minors, and empowers the Attorney General to enforce compliance with these regulations.
Date | Action |
---|---|
2025-03-11 | Coauthored by Representative Caldwell (Chad) (principal House author) |
2025-03-04 | Placed on General Order |
2025-02-27 | Reported Do Pass as amended Technology and Telecommunications committee; CR filed |
2025-02-04 | Second Reading referred to Technology and Telecommunications |
2025-02-03 | First Reading |
2025-02-03 | Authored by Senator Seifried |
Why Relevant: The bill mandates age verification and parental consent for minors to use social media platforms.
Mechanism of Influence: Platforms must implement age verification technologies and processes to reliably identify minor users and confirm parental consent before account creation or maintenance.
Evidence:
Ambiguity Notes: Details on the technical requirements for age verification are left to future rulemaking by the Attorney General.
Why Relevant: The bill prohibits algorithmic personalization and targeted ads for minors, addressing content recommendation systems.
Mechanism of Influence: Platforms must disable or alter algorithms, AI, or machine learning systems for minor users, preventing personalized feeds or recommendations based on user data.
Evidence:
Ambiguity Notes: Does not mandate risk assessments or ongoing mitigation plans, but directly bans certain algorithmic features for minors.
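Operationally, the personalization ban is a branch in feed construction: minor accounts bypass the recommender entirely. A minimal sketch; the `predicted_engagement` field, the ranking step, and the chronological fallback are stand-ins, not bill language:

```python
def rank_with_model(posts: list[dict]) -> list[dict]:
    # Stand-in for an engagement-optimizing recommender.
    return sorted(posts, key=lambda p: p.get("predicted_engagement", 0.0), reverse=True)

def build_feed(is_minor: bool, posts: list[dict]) -> list[dict]:
    """For minor accounts, skip the personalization model and fall back to a
    reverse-chronological feed; other accounts keep the ranked path."""
    if is_minor:
        return sorted(posts, key=lambda p: p["created_at"], reverse=True)
    return rank_with_model(posts)

posts = [
    {"id": 1, "created_at": "2025-03-01", "predicted_engagement": 0.9},
    {"id": 2, "created_at": "2025-03-02", "predicted_engagement": 0.1},
]
print([p["id"] for p in build_feed(is_minor=True, posts=posts)])   # [2, 1] chronological
print([p["id"] for p in build_feed(is_minor=False, posts=posts)])  # [1, 2] ranked
```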
Why Relevant: Requires platforms to provide parents with full access to their children's accounts, including monitoring and content modification.
Mechanism of Influence: Mandates technical features to allow parental account access, monitoring, and content control for minor accounts.
Evidence:
Ambiguity Notes: The scope of parental controls is broad but not tied to specific risk assessment or mitigation frameworks.
Why Relevant: The bill restricts data collection and targeted advertising for minors.
Mechanism of Influence: Platforms must alter data collection and ad targeting practices to exclude or de-identify minor users.
Evidence:
Ambiguity Notes: Does not require reporting or scanning for CSAM, but does impose privacy-related duties.
Why Relevant: Empowers the Attorney General to enforce and issue rules, including for age verification and parental consent.
Mechanism of Influence: Centralizes enforcement and allows for additional compliance requirements via rulemaking.
Evidence:
Ambiguity Notes: No explicit mention of risk assessments, mitigation plans, or CSAM detection/scanning orders.
Legislation ID: 63315
Bill URL: View Bill
Senate Bill 931 aims to implement measures that safeguard minors on social media platforms by mandating age verification, data privacy protections, and the provision of supervisory tools for parents. It outlines the responsibilities of social media companies in handling personal information of minor users, including setting default privacy settings, allowing parental consent for data changes, and providing options for minors to delete their information. Additionally, the bill empowers the Attorney General to enforce compliance and seek legal remedies for violations.
Date | Action |
---|---|
2025-03-04 | Placed on General Order |
2025-02-27 | Reported Do Pass as amended Technology and Telecommunications committee; CR filed |
2025-02-27 | Title stricken |
2025-02-27 | Coauthored by Representative Caldwell (Chad) (principal House author) |
2025-02-04 | Second Reading referred to Technology and Telecommunications |
2025-02-03 | First Reading |
2025-02-03 | Authored by Senator Jech |
Why Relevant: The bill requires social media platforms to implement 'reasonable age verification methods' for all users and restricts minors from changing privacy settings without parental consent.
Mechanism of Influence: This is a direct mandate for age verification and privacy-by-default settings for minors, impacting user privacy and potentially requiring collection of sensitive identity data.
Evidence:
Ambiguity Notes: The term 'reasonable age verification' is not defined in detail, leaving room for interpretation as to what constitutes compliance (e.g., government ID, third-party verification, etc.).
Why Relevant: Mandates parental supervisory tools for social media accounts of minors.
Mechanism of Influence: Requires platforms to provide parents access to their child's account settings and activity, impacting children's privacy.
Evidence:
Ambiguity Notes: Does not specify technical implementation or data access limitations.
Legislation ID: 115795
Bill URL: View Bill
House Bill 2032 mandates that commercial entities verify the ages of individuals accessing sexual material harmful to minors and imposes penalties for non-compliance. It specifies methods for age verification and requires the destruction of personal information used in the process. The bill allows individuals to sue businesses that fail to comply with these age verification requirements.
Date | Action |
---|---|
2025-06-27 | In committee upon adjournment. |
2025-01-17 | Referred to Judiciary. |
2025-01-13 | First reading. Referred to Speaker's desk. |
Why Relevant: The bill imposes explicit age verification obligations on commercial entities distributing sexual material online, which is one of the core elements of the EU Chat Act.
Mechanism of Influence: Providers must implement 'reasonable methods' (e.g., government ID, digital verification systems) to reliably identify whether a user is under 18 before granting access to specified content.
Evidence:
Ambiguity Notes: The bill does not specify requirements for interpersonal communication services, risk assessments, detection/scanning, or app store gatekeeping. The scope is limited to age gating for access to certain material.
Legislation ID: 115948
Bill URL: View Bill
House Bill 2308 mandates that consumer product manufacturers include a parental control filter in devices that connect to the internet. This filter must automatically activate for users under 18 years of age and block access to obscene material. The Attorney General is empowered to enforce compliance and take legal action against manufacturers who fail to adhere to these requirements. The bill becomes effective on July 1, 2026.
Date | Action |
---|---|
2025-06-27 | In committee upon adjournment. |
2025-01-17 | Referred to Commerce and Consumer Protection. |
2025-01-13 | First reading. Referred to Speaker's desk. |
Why Relevant: The bill mandates age determination and automatic activation of parental controls for minors, which is a form of age verification and access gating.
Mechanism of Influence: Devices must 'determine the age of each user during setup' and 'automatically activate' the filter for users under 18, effectively requiring age assessment at the device level.
Evidence:
Ambiguity Notes: The bill does not specify the technical method for age determination (e.g., self-declaration, ID upload, biometric), leaving implementation details open.
Why Relevant: The bill requires a filter that blocks access to 'obscene material' for minors, which is a form of mandatory content restriction.
Mechanism of Influence: Manufacturers must ensure devices include and activate a filter for certain content, resulting in technical restrictions on access.
Evidence:
Ambiguity Notes: Scope is limited to 'obscene material'; does not explicitly mention CSAM or broader categories.
Why Relevant: The bill gives enforcement powers to the state Attorney General, including investigation and civil penalties.
Mechanism of Influence: Manufacturers are subject to legal action and penalties for non-compliance, creating a strong incentive to implement the required controls.
Evidence:
Ambiguity Notes: None
Legislation ID: 117089
Bill URL: View Bill
House Bill 3696 establishes regulations for software applications used by minors in Oregon. It mandates that app stores verify the age of users, categorize them into specific age groups, and require parental consent before minors can download or make purchases in apps. Developers must display clear age ratings and provide tools for parents to limit app usage. The bill includes provisions for enforcement, penalties for non-compliance, and the establishment of an advisory committee to improve age rating transparency.
Date | Action |
---|---|
2025-02-27 | Referred to Commerce and Consumer Protection. |
2025-02-25 | First reading. Referred to Speaker's desk. |
Why Relevant: The bill imposes explicit duties on app stores to verify user age and require parental consent before minors can access or purchase apps, mirroring a core 'gatekeeping' feature of the EU Chat Act.
Mechanism of Influence: App stores must implement age verification systems and block access to apps for minors without verified parental consent. They must also disclose age ratings and provide parental controls.
Evidence:
Ambiguity Notes: The bill does not specify technical standards for age verification, leaving some latitude in implementation.
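The categorize-then-gate flow this entry describes can be sketched in a few lines. The bucket boundaries are assumptions, since the bill requires categorization into "specific age groups" without fixing them, and the `authorize` helper is invented for illustration:

```python
def age_group(age: int) -> str:
    """Illustrative buckets for the bill's required age categories."""
    if age < 13:
        return "child"
    if age < 16:
        return "young_teen"
    if age < 18:
        return "older_teen"
    return "adult"

def authorize(age: int, parental_consent: bool, action: str) -> bool:
    """Downloads and in-app purchases by minors both require verified consent."""
    assert action in {"download", "purchase"}
    return age_group(age) == "adult" or parental_consent

print(authorize(15, parental_consent=False, action="download"))  # False
print(authorize(15, parental_consent=True, action="purchase"))   # True
```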
Why Relevant: Developers are required to display accurate age ratings, notify app stores of changes, and provide parental control features, which are elements of risk mitigation and transparency.
Mechanism of Influence: Developers must integrate parental controls and ensure age ratings are clear and kept up to date.
Evidence:
Ambiguity Notes: No obligation for developers to conduct ongoing risk assessments or modify app functionality in response to risk.
Why Relevant: Enforcement provisions allow for civil penalties and private lawsuits, ensuring compliance.
Mechanism of Influence: Violators face financial penalties and injunctions; parents can seek damages for violations.
Evidence:
Ambiguity Notes: Enforcement is focused on compliance with age verification and parental consent, not broader content moderation.
Legislation ID: 194027
Bill URL: View Bill
This legislation amends Title 50 of the Pennsylvania Consolidated Statutes to introduce measures aimed at safeguarding minors using social media. It includes provisions for parental consent, reporting mechanisms for hateful conduct, and penalties for social media companies that fail to comply with these regulations. The bill recognizes the significant impact of social media on the mental health of minors and seeks to empower parents while holding companies accountable for the safety of young users.
Date | Action |
---|---|
2025-05-08 | Referred to Communications & Technology |
Why Relevant: The bill requires social media platforms to verify user age and obtain/document parental consent for minors, which is a form of age verification.
Mechanism of Influence: Platforms must implement processes to reliably identify minors and prevent account creation without parental consent, potentially requiring robust age verification systems.
Evidence:
Ambiguity Notes: The summary does not specify technical standards for age verification (e.g., biometric, ID upload), nor does it address ongoing age assessment or risk-based gating for high-risk services.
Legislation ID: 194102
Bill URL: View Bill
This legislation amends Title 18 of the Pennsylvania Consolidated Statutes to include provisions regarding liability for Internet publishers and distributors of harmful material to minors. It outlines the responsibilities of commercial entities in verifying the age of users accessing such material and prohibits the retention or sharing of personal identifying information of minors. The bill also provides for legal recourse for violations, including damages and injunctive relief.
Date | Action |
---|---|
2025-05-29 | Referred to Communications & Technology
Why Relevant: The bill includes a statutory duty for commercial entities to implement 'reasonable age verification measures' to restrict minors' access to certain online material.
Mechanism of Influence: It compels websites hosting 'harmful to minors' content to verify user age, likely requiring upload of ID or use of third-party verification services.
Evidence:
Ambiguity Notes: 'Reasonable age verification measure' is not defined in detail; the scope of 'harmful to minors' and technical requirements are left open to interpretation.
Why Relevant: Prohibits retention or sharing of personal identifying information collected for age verification.
Mechanism of Influence: Limits the privacy impact of age verification by banning storage or disclosure of user data collected during the process.
Evidence:
Ambiguity Notes: The bill does not specify audit or enforcement mechanisms for these prohibitions.
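A retention ban of this kind implies a verify-then-discard flow in which only the pass/fail result survives the check. The sketch below is an assumption-laden illustration: `extract_birth_year` stands in for whatever ID-parsing or OCR service an implementer would actually use.

```python
def extract_birth_year(id_document: bytes) -> int:
    # Stand-in for a real ID-parsing/OCR step.
    return int(id_document.decode())

def verify_age_and_discard(id_document: bytes, current_year: int = 2025) -> bool:
    """Return only a pass/fail bit; persist nothing from the document."""
    is_adult = (current_year - extract_birth_year(id_document)) >= 18
    # No write to disk and no log line containing the document or birth
    # year: the boolean is the only thing that leaves this function,
    # which is what a no-retention clause effectively requires.
    return is_adult

print(verify_age_and_discard(b"1990"))  # True
print(verify_age_and_discard(b"2015"))  # False
```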
Why Relevant: Provides for court-ordered injunctive relief and statutory damages for violations, creating legal recourse for enforcement.
Mechanism of Influence: Enables individuals to sue non-compliant sites, increasing compliance pressure.
Evidence:
Ambiguity Notes: No mention of administrative enforcement, only private right of action.
Why Relevant: Exempts ISPs and platforms that do not control content, limiting the scope to direct publishers/distributors.
Mechanism of Influence: Means the law does not impose scanning or monitoring duties on general-purpose platforms or ISPs.
Evidence:
Ambiguity Notes: Scope of 'control' may be disputed in edge cases.
Legislation ID: 215575
Bill URL: View Bill
This legislation amends Title 18 of the Pennsylvania Consolidated Statutes to introduce protections for children using online services. It outlines various provisions including age verification, parental approvals for connections and transactions, privacy settings, and prohibitions against manipulative design practices. The bill seeks to ensure that children can use online platforms safely and that parents have control over their children's online interactions.
Date | Action |
---|---|
2025-07-14 | Referred to Communications & Technology
Why Relevant: The bill explicitly requires operators to conduct age verification for users and to set privacy defaults for minors, which are core elements of the EU Chat Act's risk mitigation and age-gating features.
Mechanism of Influence: Operators must implement age verification systems and default privacy settings that limit minors’ exposure to unconnected users, effectively gating access and limiting unsolicited contact.
Evidence:
Ambiguity Notes: The bill requires age verification but leaves the specific method and accuracy level to be defined via regulation by the Attorney General, introducing some ambiguity about how intrusive or effective these measures will be.
Why Relevant: Requires parental approval for new connections and financial transactions for minors under 13, which is a risk mitigation strategy, though less comprehensive than the EU Chat Act’s ongoing risk assessment and mitigation planning.
Mechanism of Influence: Mandates mechanisms for parental oversight and approval, which may require operators to develop new compliance infrastructure and user management systems.
Evidence:
Ambiguity Notes: Does not require formal risk assessments or ongoing mitigation plans as in the EU Chat Act, but the parental approval mechanism is a form of risk mitigation.
Why Relevant: Empowers the Attorney General to promulgate rules about age verification methods and accuracy, allowing for future regulatory expansion that could approximate some compliance infrastructure aspects of the EU Chat Act.
Mechanism of Influence: Potentially enables the creation of technical standards for age verification and enforcement, though not as prescriptive as the EU Chat Act.
Evidence:
Ambiguity Notes: Actual technical and compliance requirements are deferred to future regulations, so the bill’s current text is less detailed than the EU Chat Act.
Why Relevant: The bill prohibits manipulative design practices ('dark patterns') that could undermine user or parental safety choices, which is an indirect risk mitigation measure.
Mechanism of Influence: Forces operators to avoid UI/UX designs that circumvent safety or parental controls, supporting the intent of risk mitigation.
Evidence:
Ambiguity Notes: No explicit requirement for technical detection of CSAM, reporting to authorities, or scanning of private communications (unlike the EU Chat Act).
Legislation ID: 194563
Bill URL: View Bill
This bill amends Title 50 of the Pennsylvania Consolidated Statutes to include provisions specifically aimed at protecting minors on social media. It outlines the responsibilities of social media companies to verify the age of users, obtain parental consent for minors, and ensure that minors are not exposed to harmful content. The bill also establishes penalties for violations of these provisions, thereby aiming to reduce the risks associated with social media use among youth.
Date | Action |
---|---|
2025-01-24 | Referred to Communications & Technology
Why Relevant: The bill mandates 'social media companies must verify the age of users' and 'requires parental consent for minors to open accounts,' which are direct forms of age verification/assessment and parental gating.
Mechanism of Influence: Social media platforms must implement age verification systems and processes for obtaining, storing, and acting on parental consent, including revocation (account suspension/deletion).
Evidence:
Ambiguity Notes: The bill summary does not specify the technical means of age verification or the standard for verifying parental consent, which could range from self-attestation to government ID checks.
Why Relevant: The bill prohibits social media companies from knowingly allowing minors to access 'harmful content,' creating a statutory duty to restrict content based on age.
Mechanism of Influence: Platforms may need to implement content classification/filtering to comply, though no explicit scanning or detection technology is mandated.
Evidence:
Ambiguity Notes: No mention of automated scanning, detection orders, or technical content detection—duty is on access restriction, not proactive scanning.
Why Relevant: The bill allows parents to revoke consent at any time, requiring platforms to suspend or delete accounts accordingly.
Mechanism of Influence: Platforms must maintain systems to track parental consent status and act promptly on revocation requests.
Evidence:
Ambiguity Notes: Does not specify timelines or technical requirements for account suspension/deletion.
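Acting on revocation implies the platform keeps consent state per minor account and couples revocation to suspension in a single step. A minimal sketch, with invented class and method names:

```python
from datetime import datetime, timezone

class ConsentRegistry:
    """Track parental consent per minor account; revocation suspends the account."""

    def __init__(self) -> None:
        self._consent: dict[str, bool] = {}
        self._suspended: set[str] = set()

    def grant(self, account_id: str) -> None:
        self._consent[account_id] = True
        self._suspended.discard(account_id)

    def revoke(self, account_id: str) -> None:
        self._consent[account_id] = False
        # Suspend in the same step, so revocation takes effect immediately.
        self._suspended.add(account_id)
        print(f"{datetime.now(timezone.utc).isoformat()} suspended {account_id}")

    def is_active(self, account_id: str) -> bool:
        return self._consent.get(account_id, False) and account_id not in self._suspended

reg = ConsentRegistry()
reg.grant("minor-42")
assert reg.is_active("minor-42")
reg.revoke("minor-42")
assert not reg.is_active("minor-42")
```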
Legislation ID: 195105
Bill URL: View Bill
This bill amends Title 18 of the Pennsylvania Consolidated Statutes by adding a subchapter that mandates commercial entities with publicly accessible websites containing harmful material to implement reasonable age verification methods. It defines harmful material and outlines the responsibilities and liabilities of these entities regarding the protection of minors online.
Date | Action |
---|---|
2025-04-09 | Referred to Judiciary
Why Relevant: The bill explicitly mandates age verification for access to certain online content by minors.
Mechanism of Influence: Websites with harmful material must implement age verification to restrict minors' access.
Evidence:
Ambiguity Notes: 'Reasonable age verification methods' is not further specified, so the exact technical requirements are unclear. The scope is limited to websites with harmful material, not general social media or messaging platforms.
The Children's Default to Safety Act seeks to amend the South Carolina Code of Laws by adding Article 9 to Chapter 5, Title 39. It requires manufacturers to ensure that devices activated in South Carolina automatically enable filters to block harmful content for minors, outlines definitions related to the act, and establishes civil and criminal liabilities for non-compliance. Additionally, it prohibits unauthorized individuals from providing passcodes to bypass these filters.
Date | Action |
---|---|
2025-02-04 | Member(s) request name added as sponsor: Govan, Erickson, Bradley |
2025-01-14 | Introduced and read first time ( House Journal-page 190 ) |
2025-01-14 | Referred to Committee on Judiciary ( House Journal-page 190 ) |
2024-12-05 | Prefiled |
2024-12-05 | Referred to Committee on Judiciary |
Why Relevant: The bill mandates that device manufacturers implement default content filters for minors, which is a form of risk mitigation aimed at preventing access to harmful content.
Mechanism of Influence: Manufacturers must ensure filters are enabled by default and passcode-protected, directly impacting device functionality for minors.
Evidence:
Ambiguity Notes: The term 'harmful to minors' is defined, but the technical requirements for filtering are not specified in detail.
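The default-on, passcode-protected filter described above reduces to a small state machine: enabled at activation, and only a correct guardian passcode turns it off. A sketch under those assumptions (the hashing scheme is mine, not the statute's):

```python
import hashlib
import hmac

class DeviceFilter:
    """Content filter that ships enabled; only a guardian passcode disables it."""

    def __init__(self, guardian_passcode: str) -> None:
        self.enabled = True  # default-on at activation, per the mandate
        self._digest = hashlib.sha256(guardian_passcode.encode()).digest()

    def disable(self, passcode: str) -> bool:
        supplied = hashlib.sha256(passcode.encode()).digest()
        if hmac.compare_digest(supplied, self._digest):
            self.enabled = False
            return True
        return False  # wrong passcode: the filter stays on

f = DeviceFilter(guardian_passcode="s3cret")
assert f.enabled
assert not f.disable("guess")   # non-guardian attempt fails
assert f.disable("s3cret")
assert not f.enabled
```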
Why Relevant: The law establishes liability for manufacturers if minors access harmful material due to non-compliance, creating a strong incentive for ongoing compliance.
Mechanism of Influence: Manufacturers face civil penalties and private lawsuits if their filters fail or are not implemented.
Evidence:
Ambiguity Notes: Does not specify technical compliance audits or oversight beyond legal liability.
Why Relevant: The law prohibits non-guardians from providing passcodes to remove filters, creating a criminal offense for circumvention.
Mechanism of Influence: Imposes criminal penalties for unauthorized passcode sharing, reinforcing filter integrity.
Evidence:
Ambiguity Notes: Does not address technical enforcement or detection of circumvention.
Why Relevant: The Attorney General is empowered to enforce the law, including seeking penalties and injunctions against manufacturers.
Mechanism of Influence: Adds a layer of public enforcement in addition to private rights of action.
Evidence:
Ambiguity Notes: No explicit mention of coordinated reporting or centralized routing of complaints.
This bill introduces regulations for online services and products targeting children, mandating that they be designed with the best interests of children in mind. It outlines definitions related to children's data, requirements for covered entities, and enforcement measures to ensure compliance with data protection standards. The legislation specifies how personal data should be handled and emphasizes the importance of safeguarding children's privacy and safety online.
Date | Action |
---|---|
2025-01-14 | Introduced and read first time ( House Journal-page 191 ) |
2025-01-14 | Referred to Committee on Judiciary ( House Journal-page 191 ) |
2024-12-05 | Prefiled |
2024-12-05 | Referred to Committee on Judiciary |
Why Relevant: The bill requires 'data protection impact assessments' and mandates that online services set privacy settings for children to high by default.
Mechanism of Influence: Covered entities must assess and document risks to children’s privacy and implement technical and policy controls to mitigate them.
Evidence:
Ambiguity Notes: The DPIA requirement is focused on privacy/data protection, not specifically on CSAM or abuse risk; 'risk assessment' is present but context is privacy, not content moderation.
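As a rough illustration of the kind of producible DPIA artifact such a requirement implies (the schema and field names are hypothetical; the bill does not prescribe one):

```python
from dataclasses import dataclass, asdict

@dataclass
class DataProtectionImpactAssessment:
    service: str
    data_collected: list[str]
    risks_to_children: list[str]
    mitigations: list[str]
    high_privacy_default: bool = True  # high-privacy defaults are mandated

dpia = DataProtectionImpactAssessment(
    service="kids-video-app",
    data_collected=["approximate age band", "watch history"],
    risks_to_children=["profiling", "contact from unknown adults"],
    mitigations=["no targeted advertising", "private-by-default profiles"],
)
print(asdict(dpia))  # the record an Attorney General could request
```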
Why Relevant: The Attorney General can request DPIAs and enforce compliance.
Mechanism of Influence: Enforcement ensures that covered entities conduct and provide DPIAs, but does not extend to compelled scanning or content moderation.
Evidence:
Ambiguity Notes: AG's powers are limited to privacy/data protection compliance.
Why Relevant: Prohibits harmful or manipulative practices targeting children, including dark patterns and targeted advertising.
Mechanism of Influence: Prevents certain forms of data-driven manipulation or exploitation, but not specific to CSAM or abuse detection.
Evidence:
Ambiguity Notes: Prohibitions are about privacy and digital well-being, not content moderation or detection.
The App Store Accountability Act seeks to regulate app stores by mandating age verification for users, requiring parental consent for minors, and ensuring that content ratings are clear and accurate. It establishes guidelines for app store providers and developers to follow, with enforcement mechanisms and potential penalties for non-compliance, while emphasizing the importance of parental involvement in minors' app usage.
Date | Action |
---|---|
2025-04-23 | Member(s) request name added as sponsor: Martin |
2025-01-14 | Referred to Committee on Judiciary ( House Journal-page 192 ) |
2025-01-14 | Introduced and read first time ( House Journal-page 192 ) |
2024-12-05 | Prefiled |
2024-12-05 | Referred to Committee on Judiciary |
Why Relevant: The bill explicitly requires app store providers to verify user ages and obtain parental consent before allowing minors to access apps.
Mechanism of Influence: App stores must implement reliable age verification and parental consent mechanisms for every download or purchase by a minor. This is a direct, binding duty on platforms.
Evidence:
Ambiguity Notes: The technical standard for 'age verification' is not specified, but the duty is mandatory and includes per-download consent.
Why Relevant: App store providers must provide parental controls, including blocking unsuitable apps and setting usage limits for minors.
Mechanism of Influence: Platforms must offer technical means for parents to restrict app access and usage by minors, enforcing parental involvement.
Evidence:
Ambiguity Notes: No mention of compelled detection/scanning or risk assessments, but strong age-gating and access control duties are present.
Why Relevant: Developers and app stores must display clear age ratings and content descriptions, with notification and additional consent required for rating changes.
Mechanism of Influence: This increases transparency and strengthens parental control over app content exposure for minors.
Evidence:
Ambiguity Notes: No explicit risk assessment or mitigation plan duty for platforms, only for content labeling and parental notification.
Legislation ID: 196584
Bill URL: View Bill
Bill 3406 introduces regulations for covered platform operators to ensure that pornographic images can only be uploaded with verified consent from individuals depicted. It outlines definitions related to consent, age verification, and procedures for handling requests for image removal. The bill also establishes civil and criminal penalties for violations, as well as a legislative committee to address issues related to online technological exploitation.
Date | Action |
---|---|
2025-01-14 | Referred to Committee on Judiciary ( House Journal-page 192 ) |
2025-01-14 | Introduced and read first time ( House Journal-page 192 ) |
2024-12-05 | Prefiled |
2024-12-05 | Referred to Committee on Judiciary |
Why Relevant: The bill requires platforms to verify the age and identity of users uploading pornographic images.
Mechanism of Influence: Mandates that operators confirm users are at least 18 years old and verify their identity before allowing uploads, using methods such as adult access codes or digital certificates.
Evidence:
Ambiguity Notes: Verification applies only to users uploading pornographic images, not all users or all communications; does not specify continuous or platform-wide age gating.
Why Relevant: Operators must obtain and maintain consent forms for all depicted individuals.
Mechanism of Influence: Requires platforms to collect and store detailed consent documentation, including age and identification, for any person depicted in uploaded content.
Evidence:
Ambiguity Notes: Scope is limited to pornographic images and does not extend to all user-generated content or messaging.
Why Relevant: Mandates procedures for removing non-consensual pornographic images from platforms upon request.
Mechanism of Influence: Operators must have designated staff and display removal instructions, and must act on removal requests from eligible individuals.
Evidence:
Ambiguity Notes: No explicit statutory deadline for removal or penalties for non-compliance beyond general civil penalties.
Why Relevant: Imposes civil and criminal penalties on platforms and individuals for violations.
Mechanism of Influence: Creates financial and legal incentives for compliance with age/identity verification and consent requirements.
Evidence:
Ambiguity Notes: Penalties are tied to violations of the pornographic image provisions only.
Bill 444, known as the South Carolina Sports Wagering Act, aims to legalize sports wagering and ancillary activities under specified conditions. It establishes the South Carolina Sports Wagering Commission to oversee the licensing, regulation, and enforcement of sports wagering activities, while also providing definitions and requirements for operators and participants in the sports wagering industry.
Date | Action |
---|---|
2025-03-12 | Introduced and read first time |
2025-03-12 | Referred to Committee on Labor, Commerce and Industry ( Senate Journal-page 4 ) |
Why Relevant: The bill requires 'verification of identity and age' for all bettors (Registration Requirements for Bettors).
Mechanism of Influence: Operators must ensure only adults (not minors) can register and wager, using some form of age and identity verification.
Evidence:
Ambiguity Notes: The bill does not specify the technical means or reliability standards for age verification; it only requires verification as a condition of registration.
Legislation ID: 75721
Bill URL: View Bill
House Bill 1053 amends existing laws to require covered platforms to implement reasonable age verification methods to prevent minors from accessing harmful material. It defines harmful material, outlines the responsibilities of covered platforms, and establishes penalties for violations, including misdemeanors and felonies for repeated offenses. The bill also includes provisions for civil penalties and specifies that certain entities, like search engines, are exempt from these requirements.
Date | Action |
---|---|
2025-02-27 | Signed by the Governor |
2025-02-24 | Delivered to the Governor |
2025-02-21 | Signed by the President |
2025-02-20 | Signed by the Speaker |
2025-02-19 | Do Pass as amended |
2025-02-18 | Schedule for hearing |
2025-02-18 | Do Pass |
2025-01-27 | Refer to Committee |
Why Relevant: The bill explicitly requires age verification for access to certain online content, which is a core element of the EU Chat Act's approach to online child protection.
Mechanism of Influence: Covered platforms must implement 'reasonable age verification processes' to ensure users are 18+ before accessing harmful material. This imposes a technical and compliance burden on platforms and impacts user privacy by requiring age checks.
Evidence:
Ambiguity Notes: The bill does not specify the technical method of age verification, leaving room for a range of implementations (from self-attestation to ID upload). It does not mandate age verification for all users—only those accessing 'harmful material.'
Legislation ID: 76261
Bill URL: View Bill
Senate Bill 18 mandates that websites containing material deemed harmful to minors must verify the age of users attempting to access such content. The bill outlines definitions related to harmful material, specifies the requirements for age verification, and sets forth penalties for violations, including misdemeanors and felonies for repeated offenses. It also clarifies that general-purpose search engines and internet service providers are exempt from these requirements.
Date | Action |
---|---|
2025-02-18 | Table |
2025-02-18 | Schedule for hearing |
2025-01-14 | First read in Senate and referred |
Why Relevant: The bill creates a binding duty for covered platforms to implement age verification for access to certain online content.
Mechanism of Influence: Websites must verify that users are at least 18 using government-issued ID or financial account information, and are subject to fines for non-compliance.
Evidence:
Ambiguity Notes: The bill does not specify technical standards for verification, nor does it require ongoing risk assessments or mitigation plans beyond age gating.
Why Relevant: The bill prohibits retention of identifying information after access is granted.
Mechanism of Influence: Platforms must delete age verification data after granting access, limiting long-term privacy risks.
Evidence:
Ambiguity Notes: No details are given on how deletion is enforced or audited.
Why Relevant: The bill establishes penalties and a compliance notification process.
Mechanism of Influence: Non-compliant platforms are subject to fines and must implement corrective measures within ninety days of notice.
Evidence:
Ambiguity Notes: No mention of risk assessments, reporting to authorities, or broader compliance infrastructure.
Legislation ID: 76696
Bill URL: View Bill
Senate Bill 180 aims to implement age verification protocols for users of online application stores in South Dakota. The bill defines various age categories and outlines the responsibilities of application store providers and developers in verifying user ages and obtaining parental consent for minors. It establishes penalties for violations and ensures the protection of personal data related to age verification.
Date | Action |
---|---|
2025-02-18 | Amend |
2025-02-18 | Schedule for hearing |
2025-02-18 | Do Pass |
2025-02-18 | Defer to the 41st legislative day |
2025-02-03 | Refer to Committee |
2025-01-30 | First Reading Senate |
Why Relevant: The bill imposes binding age verification and parental consent requirements on application store providers and developers, similar to the EU Chat Act's age-assessment and app-store gatekeeping features.
Mechanism of Influence: Providers must verify age at account creation and obtain parental consent for minors before allowing access to certain applications. Developers must notify stores if their apps are likely to be accessed by children and ensure compliance with age verification.
Evidence:
Ambiguity Notes: The bill does not specify the technical methods for age verification, leaving standards to be set by the Attorney General. It is unclear how robust or privacy-invasive the verification must be.
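One plausible reading of the provider-side duty is: classify the user at account creation, store only the age band rather than a birth date, and link minor accounts to a parent who approves applications individually. The cut-offs and field names in this sketch are invented:

```python
from dataclasses import dataclass, field

def category_for(age: int) -> str:
    # Invented cut-offs: the bill defines age categories, but the summary
    # above does not enumerate them.
    if age < 13:
        return "child"
    if age < 16:
        return "younger teen"
    if age < 18:
        return "older teen"
    return "adult"

@dataclass
class StoreAccount:
    account_id: str
    age_category: str                 # store the band, not the birth date
    parent_account: str | None = None
    approved_apps: set[str] = field(default_factory=set)

def create_account(account_id: str, verified_age: int,
                   parent: str | None = None) -> StoreAccount:
    category = category_for(verified_age)
    if category != "adult" and parent is None:
        raise ValueError("minor accounts must be linked to a parent account")
    return StoreAccount(account_id, category, parent)

def can_install(acct: StoreAccount, app_id: str) -> bool:
    return acct.age_category == "adult" or app_id in acct.approved_apps

kid = create_account("k1", 12, parent="p1")
kid.approved_apps.add("edu.math")     # parental consent recorded per app
print(can_install(kid, "edu.math"), can_install(kid, "social.chat"))  # True False
```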
Why Relevant: The bill explicitly targets app-store gatekeeping by placing duties on application store providers to verify age and manage parental consent.
Mechanism of Influence: App stores must implement age checks and cannot allow minors to access certain apps without verified parental consent.
Evidence:
Ambiguity Notes: The scope of 'covered applications' is not fully defined in the summary, so it is unclear if all apps or only some are included.
Legislation ID: 131530
Bill URL: View Bill
This bill amends the Business & Commerce Code to establish age limitations for social media platform users and requires electronic devices to have filters and markers that prevent minors from accessing explicit content. It also requires schools to block access to social media during instructional time and ensures that personal information used for age verification is not retained.
Date | Action |
---|---|
2025-05-14 | Placed on General State Calendar |
2025-05-12 | Considered in Calendars |
2025-05-10 | Committee report sent to Calendars |
2025-05-10 | Committee report distributed |
2025-05-09 | Comte report filed with Committee Coordinator |
2025-04-30 | Reported favorably as substituted |
2025-04-30 | Considered in public hearing |
2025-04-30 | Committee substitute considered in committee |
Why Relevant: The bill requires social media platforms to verify user age and prohibits children under 18 from using social media.
Mechanism of Influence: This creates a statutory duty for platforms to implement robust age verification, which is a core feature of the EU Chat Act.
Evidence:
Ambiguity Notes: The exact technical standards for age verification are not specified, but deletion of verification data is required.
Why Relevant: Manufacturers of electronic devices must implement markers and filters to prevent minors from accessing explicit material and social media.
Mechanism of Influence: This compels device-level filtering and identification of minor users, which is analogous to compelled technical mitigation.
Evidence:
Ambiguity Notes: Unclear if this applies to all devices or just those supplied by schools; 'marker' function may raise privacy concerns.
Legislation ID: 220881
Bill URL: View Bill
This bill, titled Sammy's Law, introduces regulations for large social media platforms to ensure child safety through the use of third-party safety management software. It outlines definitions, requirements for platforms, and the role of the Department of Information Resources in overseeing compliance and reporting. The bill also includes provisions for user data management, limitations on liability, and disclosure requirements to protect children from cyberbullying, trafficking, and other online dangers.
Date | Action |
---|---|
2025-04-01 | Read first time |
2025-04-01 | Referred to Trade, Workforce & Economic Development |
2025-03-11 | Filed |
Why Relevant: Mandates 'real-time APIs' for third-party safety software to manage children's online activity.
Mechanism of Influence: Platforms must technically enable third-party oversight and intervention in children's online interactions, which may impact privacy and data flows.
Evidence:
Ambiguity Notes: No explicit requirement for scanning/detection or risk assessment by platforms themselves. Focus is on access/control by parents or delegates, not automated moderation or detection.
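The "real-time API" requirement is essentially a push interface from the platform to parent-designated safety software. Below is a toy in-process version; a real deployment would use authenticated webhooks, and every name here is invented:

```python
import json
from typing import Callable

class SafetyEventBus:
    """Stand-in for the bill's 'real-time API': the platform pushes a child
    account's safety-relevant events to third-party safety software that a
    parent or delegate has registered."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict], None]]] = {}

    def register(self, child_account: str, deliver: Callable[[dict], None]) -> None:
        # A real platform would authenticate the safety vendor and check the
        # parent's authorization before accepting this registration.
        self._subscribers.setdefault(child_account, []).append(deliver)

    def publish(self, child_account: str, event_type: str, payload: dict) -> None:
        event = {"account": child_account, "type": event_type, **payload}
        for deliver in self._subscribers.get(child_account, []):
            deliver(event)

bus = SafetyEventBus()
bus.register("child-7", lambda e: print("safety tool received:", json.dumps(e)))
bus.publish("child-7", "new_contact_request", {"from": "unknown-user-99"})
```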
Legislation ID: 132533
Bill URL: View Bill
This legislation introduces Chapter 121 to the Business & Commerce Code, focusing on the regulation of software application distribution platforms. It outlines definitions, responsibilities of app stores and developers regarding user age verification, parental consent requirements, and the safeguarding of personal data. The bill also specifies penalties for violations and allows parents or guardians to seek legal recourse in the event of breaches.
Date | Action |
---|---|
2025-04-15 | Testimony taken/registration(s) recorded in committee |
2025-04-15 | Left pending in committee |
2025-04-15 | Scheduled for public hearing on . . . |
2025-04-15 | Considered in public hearing |
2025-04-03 | Referred to Trade, Workforce & Economic Development |
2025-04-03 | Read first time |
2025-03-13 | Filed |
Why Relevant: The bill imposes explicit duties on app stores to verify user age and obtain parental consent for minors, and mandates the display of age ratings for apps.
Mechanism of Influence: App stores must implement systems to verify user age and ensure parental consent before minors can download or purchase applications. This is a direct age-gating and age-verification requirement, similar to one of the core EU Chat Act features.
Evidence:
Ambiguity Notes: The bill does not specify the technical standard for age verification (e.g., document checks, biometric, or self-attestation), nor does it mention ongoing risk assessments or mitigation plans beyond age verification.
Why Relevant: Developers are required to assign age ratings and implement a system to verify user age categories.
Mechanism of Influence: Developers must ensure their applications are appropriately rated for age and that mechanisms are in place to verify user age, which supports the app store's compliance obligations.
Evidence:
Ambiguity Notes: No details are given about how age verification is to be implemented or enforced at the developer level, nor about any required risk assessments or content moderation.
Legislation ID: 134852
Bill URL: View Bill
This bill proposes the introduction of mandatory electronic device filters to block obscene materials for devices activated in Texas, particularly focusing on protecting minors. It defines key terms, outlines the requirements for manufacturers, and establishes penalties for violations, including civil actions by parents or guardians against manufacturers and nonparent violators. The bill also stipulates that violations can result in criminal charges.
Date | Action |
---|---|
2025-04-08 | Co-author authorized |
2025-03-25 | Read first time |
2025-03-25 | Referred to State Affairs |
2025-03-13 | Filed |
2025-03-13 | Received by the Secretary of the Senate |
Why Relevant: The bill imposes a statutory duty on device manufacturers to implement and maintain technology that blocks obscene materials for minors, including an age-determination mechanism and filter management.
Mechanism of Influence: Manufacturers must enable filters that 'automatically determine the user's age and apply restrictions if the user is a minor' and ensure these filters cannot be disabled or circumvented for minors (see 'Electronic Device Filter Required').
Evidence:
Ambiguity Notes: The method of age determination is not specified—could range from self-declaration to more intrusive verification. 'Obscene' is defined, but the technical scope of filtering is not detailed.
Why Relevant: There is a form of compelled technology deployment (filtering) by device manufacturers, with legal penalties for noncompliance.
Mechanism of Influence: Manufacturers face civil penalties of up to $5,000 per violation or $50,000 total and potential criminal penalties if filters are not enabled or are circumvented (see 'Civil Penalty' and 'Offense; Criminal Penalty').
Evidence:
Ambiguity Notes: No explicit requirement to scan user content or communications, only to block access to certain materials.
Why Relevant: The bill includes a basic form of age assessment/verification as a trigger for content filtering.
Mechanism of Influence: Devices must use an unspecified method to 'automatically determine the user's age' to apply restrictions for minors.
Evidence:
Ambiguity Notes: No detail on how age is determined (e.g., user input, document upload, biometrics).
Legislation ID: 212416
Bill URL: View Bill
This bill amends the Business & Commerce Code to include provisions that require digital service providers to implement strict access and communication controls for accounts belonging to known minors. The bill sets default settings for such accounts, limits interactions with unknown users, and allows verified parents to modify these settings. Additionally, it outlines the responsibilities of digital service providers regarding the use of algorithms in relation to minors.
Date | Action |
---|---|
2025-05-25 | Committee report distributed |
2025-05-25 | Comte report filed with Committee Coordinator |
2025-05-22 | Reported favorably w/o amendment(s) |
2025-05-22 | Considered in formal meeting |
2025-05-21 | Scheduled for public hearing on . . . |
2025-05-21 | Left pending in committee |
2025-05-21 | Considered in public hearing |
2025-05-21 | Testimony taken/registration(s) recorded in committee |
Why Relevant: The bill mandates platform-level controls for minor accounts, including restrictions on communications and interactions, which is a form of risk mitigation targeting child safety.
Mechanism of Influence: Providers must proactively configure and enforce technical settings that limit usage, block unsolicited messages, and restrict content interactions for minors. These measures are aimed at reducing risks of child exploitation and harmful contact.
Evidence:
Ambiguity Notes: The bill does not reference ongoing risk assessments, formal mitigation plans, or detection/scanning for CSAM. It also does not mention reporting, removal orders, data preservation, or explicit compliance infrastructure.
Why Relevant: The bill prohibits algorithmic content suggestions for minors and certain social features, which addresses risk mitigation through product design.
Mechanism of Influence: For minor accounts, algorithms cannot be used to suggest content, and social features like comments and public friend lists are disabled, reducing exposure to harmful interactions.
Evidence:
Ambiguity Notes: No reference to broader risk assessment frameworks, app store obligations, or detection orders. Controls are limited to product features, not active monitoring or scanning.
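Because the duties are default settings rather than scanning, the compliance surface resembles a settings profile applied to known-minor accounts. A sketch with invented field names:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AccountSettings:
    algorithmic_feed: bool
    comments_enabled: bool
    public_friend_list: bool
    dms_from_unknown_users: bool

MINOR_DEFAULTS = AccountSettings(
    algorithmic_feed=False,        # no algorithmic content suggestions
    comments_enabled=False,        # social features off by default
    public_friend_list=False,
    dms_from_unknown_users=False,  # block unsolicited messages
)

ADULT_DEFAULTS = AccountSettings(True, True, True, True)

def defaults_for(is_known_minor: bool) -> AccountSettings:
    # Under the bill, only a verified parent could later relax these;
    # that flow is out of scope for this sketch.
    return MINOR_DEFAULTS if is_known_minor else ADULT_DEFAULTS

print(defaults_for(True))
```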
The bill provides definitions for key terms related to app stores, including what constitutes an app store, the role of app store providers, and the nature of mobile devices. It seeks to regulate how apps are made available to users, ensuring that app store providers adhere to certain standards.
Date | Action |
---|---|
2025-03-26 | Governor Signed |
2025-03-17 | Senate/ to Governor |
2025-03-17 | Senate/ received enrolled bill from Printing |
2025-03-14 | Senate/ enrolled bill to Printing |
2025-03-14 | Enrolled Bill Returned to House or Senate |
2025-03-05 | Senate/ placed on Concurrence Calendar |
2025-03-05 | House/ signed by Speaker/ returned to Senate |
2025-03-05 | Draft of Enrolled Bill Prepared |
Why Relevant: The bill directly mandates age verification for app store users and parental consent for minors, and applies these duties to both app store providers and app developers.
Mechanism of Influence: App stores must verify user age at account creation and require parental consent before minors can download or purchase apps. Developers must ensure age verification and parental consent via the app store. The Division of Consumer Protection is authorized to define acceptable age verification methods.
Evidence:
Ambiguity Notes: The bill does not specify technical mechanisms for age verification or parental consent, leaving the details to rulemaking. It does not mandate scanning, detection orders, or risk assessments beyond age and consent verification.
Legislation ID: 200361
Bill URL: View Bill
Act No. 63 creates the Vermont age-appropriate design code, which mandates that businesses protect minors from potential harms when processing their data and using digital products. It requires that the default privacy settings for minors be the highest level of privacy, prohibits the collection or sharing of minors' personal data unless necessary for specific services, and restricts monitoring of minors' activity without clear notification. The Attorney General is given the authority to define practices that may lead to compulsive use and to suggest methods for age estimation.
Date | Action |
---|---|
2025-06-13 | Signed by Governor on [June 12, 2025] |
2025-06-13 | Senate Message: Signed by Governor [June 12, 2025] |
2025-06-06 | Delivered to Governor on [June 6, 2025] |
2025-05-29 | House proposal of amendment concurred in with further proposal of amendment as moved by Senator(s) [Harrison and Plunkett] |
2025-05-29 | Rules suspended and taken up for immediate consideration, pending entry on Notice Calendar, as moved by Rep. [McCoy of Poultney] |
2025-05-29 | House proposal of amendment; text |
2025-05-29 | Rules suspended and bill messaged forthwith to the Senate as moved by Rep. [McCoy of Poultney] |
2025-05-29 | Rules suspended & messaged to House forthwith, on motion of Senator [Baruth] |
Why Relevant: The Act addresses age estimation and default privacy settings for minors, which are related to age verification and privacy controls.
Mechanism of Influence: Covered businesses may need to implement age estimation to determine which users are minors and must default to highest privacy settings for minors.
Evidence:
Ambiguity Notes: The law references 'methods for age estimation' but does not explicitly mandate robust age verification or specify technical requirements.
Why Relevant: The Act restricts data collection/sharing and monitoring of minors, which relates to risk mitigation and privacy.
Mechanism of Influence: Businesses must avoid unnecessary data collection/sharing and provide notification for monitoring, thus reducing risks to minors.
Evidence:
Ambiguity Notes: The Act does not impose ongoing risk assessment, compelled detection, or scanning requirements.
Why Relevant: The Act gives the Attorney General rulemaking authority, which could affect compliance infrastructure.
Mechanism of Influence: Future rules could define technical or organizational measures businesses must adopt.
Evidence:
Ambiguity Notes: No explicit requirement for legal representatives, reporting, or compliance infrastructure in the statute.
Legislation ID: 7822
Bill URL: View Bill
This bill amends the existing Consumer Data Protection Act by defining terms related to data protection and introduces restrictions on social media platforms regarding the provision of addictive feeds to minors. It outlines the responsibilities of operators of such platforms to ensure that they do not provide addictive feeds unless they can verify the age of the user or obtain parental consent. The bill also clarifies that data collected for age verification cannot be used for other purposes and prohibits operators from penalizing users to whom addictive feeds are not provided.
Date | Action |
---|---|
2025-02-04 | Left in Communications, Technology and Innovation |
2025-01-27 | Subcommittee recommends laying on the table (6-Y 4-N) |
2025-01-23 | Assigned CT & I sub: Communications |
2025-01-14 | Fiscal Impact Statement from Department of Planning and Budget (HB1624) |
2025-01-14 | Referred from General Laws and referred to Communications, Technology and Innovation (Voice Vote) |
2025-01-03 | Referred to Committee on General Laws |
2025-01-03 | Prefiled and ordered printed; Offered 01-08-2025 25102021D |
Why Relevant: The bill imposes age verification requirements on social media platforms for providing addictive feeds to minors.
Mechanism of Influence: Platforms must use 'commercially reasonable methods' to determine a user's age and cannot provide certain features to minors without verification or parental consent.
Evidence:
Ambiguity Notes: The bill does not specify technical standards for age verification, nor does it require risk assessments, detection/scanning, reporting, or data preservation.
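The obligation reduces to a feed selector: an engagement-ranked feed only for verified adults or consented minors, and a plain chronological feed otherwise. A sketch (the scoring fields are invented):

```python
def select_feed(posts: list[dict], *, verified_adult: bool,
                parental_consent: bool) -> list[dict]:
    """Engagement-ranked feed only with verification or consent;
    otherwise a plain reverse-chronological feed."""
    if verified_adult or parental_consent:
        return sorted(posts, key=lambda p: p["engagement_score"], reverse=True)
    return sorted(posts, key=lambda p: p["posted_at"], reverse=True)

posts = [
    {"id": 1, "posted_at": 100, "engagement_score": 0.9},
    {"id": 2, "posted_at": 200, "engagement_score": 0.1},
]
print([p["id"] for p in select_feed(posts, verified_adult=False,
                                    parental_consent=False)])  # [2, 1]
print([p["id"] for p in select_feed(posts, verified_adult=True,
                                    parental_consent=False)])  # [1, 2]
```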
Legislation ID: 8015
Bill URL: View Bill
This bill amends existing sections of the Code of Virginia related to consumer data protection and introduces a new section that specifically addresses the requirements for social media platforms regarding the collection and processing of personal data from minors. It defines key terms related to personal data and outlines the scope of the bill, including exemptions and the necessity for verifiable parental consent.
Date | Action |
---|---|
2025-02-04 | Left in Communications, Technology and Innovation |
2025-01-20 | Subcommittee recommends passing by indefinitely (6-Y 4-N) |
2025-01-17 | Assigned CT & I sub: Communications |
2025-01-16 | Referred from Labor and Commerce and referred to Communications, Technology and Innovation (Voice Vote) |
2025-01-14 | Fiscal Impact Statement from Department of Planning and Budget (HB1817) |
2025-01-06 | Referred to Committee on Labor and Commerce |
2025-01-06 | Prefiled and ordered printed; Offered 01-08-2025 25100801D |
Why Relevant: The bill requires social media platforms to obtain verifiable parental consent for minors under 13, which is an age verification/parental consent feature.
Mechanism of Influence: Social media platforms must implement mechanisms to verify parental consent before allowing users under 13 to create accounts, likely requiring age assessment and identity verification processes.
Evidence:
Ambiguity Notes: The bill does not specify the technical methods for age verification or parental consent; it may allow for flexibility in implementation.
Legislation ID: 9447
Bill URL: View Bill
Senate Bill No. 1214 amends existing laws related to information technology and introduces a new chapter focusing on high-risk artificial intelligence systems. It establishes guidelines for public bodies on the use of these systems, emphasizing the need for policies that address data security, privacy, and algorithmic discrimination. The bill also mandates the inclusion of compliance clauses in procurement contracts for high-risk AI systems.
Date | Action |
---|---|
2025-02-18 | Left in Appropriations |
2025-02-10 | Subcommittee recommends reporting and referring to Appropriations (7-Y 3-N) |
2025-02-10 | Reported from Communications, Technology and Innovation and referred to Appropriations (13-Y 8-N) |
2025-02-07 | Assigned CT & I sub: Communications |
2025-02-07 | Referred to Committee on Communications, Technology and Innovation |
2025-02-07 | Placed on Calendar |
2025-02-07 | Read first time |
2025-02-04 | Constitutional reading dispensed (on 3rd reading) (39-Y 0-N) |
Why Relevant: The bill requires 'risk management policies' and 'impact assessments' for deployers of high-risk AI systems, with ongoing assessment and reporting obligations for public bodies.
Mechanism of Influence: Deployers must complete an impact assessment before deployment and after significant updates, and public bodies must perform ongoing assessments and annual reporting on AI use.
Evidence:
Ambiguity Notes: The requirement is focused on algorithmic discrimination and general risk, not specifically on child safety or online abuse.
Legislation ID: 9007
Bill URL: View Bill
This bill amends existing sections of the Code of Virginia to redefine key terms and introduce new requirements for the handling of children's personal data. It establishes a new provision that mandates verifiable parental consent before a controller or processor can collect, use, or disclose the personal data of children under 18 years of age.
Date | Action |
---|---|
2025-01-29 | Passed by indefinitely in General Laws and Technology (8-Y 6-N 1-A) |
2025-01-10 | Fiscal Impact Statement from Department of Planning and Budget (SB783) |
2024-12-27 | Referred to Committee on General Laws and Technology |
2024-12-27 | Prefiled and ordered printed; Offered 01-08-2025 25103105D |
Why Relevant: The bill imposes a duty on online service providers to obtain verifiable parental consent before collecting or disclosing personal data of children under 18.
Mechanism of Influence: It mandates age-based gating through parental consent, requiring platforms to implement mechanisms to verify the age of users and the identity of parents.
Evidence:
Ambiguity Notes: The bill does not specify technical details about age verification or whether it applies to all online services or only those directed at children. It lacks explicit requirements for risk assessments, detection/scanning, or ongoing mitigation.
Why Relevant: The bill references compliance with COPPA as sufficient for compliance, suggesting alignment with existing federal privacy protections for children.
Mechanism of Influence: Entities already compliant with COPPA do not need to take additional steps under this law.
Evidence:
Ambiguity Notes: COPPA applies to children under 13, while this bill covers under 18, so there may be a gap in the scope of protections.
Why Relevant: The bill provides definitions and methods for obtaining parental consent, which are relevant for practical implementation.
Mechanism of Influence: Specifies methods such as signed forms, payment systems, or government-issued ID, shaping how platforms must build compliance infrastructure.
Evidence:
Ambiguity Notes: Does not mandate specific technologies or error rates; leaves room for interpretation of 'reasonable efforts.'
This bill establishes a new chapter in Title 19 RCW focusing on the protection of minors in online environments. It defines key terms related to online services and mandates certain practices for businesses that provide online products likely to be accessed by minors. The bill outlines requirements for age verification, privacy settings, and restrictions on data collection and use, particularly concerning minors under 13. It also addresses the use of addictive internet services and sets forth penalties for non-compliance.
Date | Action |
---|---|
2025-03-19 | Returned to Rules Committee for second reading. |
2025-03-06 | Rules Committee relieved of further consideration. Placed on second reading. |
2025-02-28 | APP - Majority; 2nd substitute bill be substituted, do pass. (View 2nd Substitute) (Majority Report) |
2025-02-28 | Minority; do not pass. (Minority Report) |
2025-02-28 | Referred to Rules 2 Review. |
2025-02-28 | Executive action taken in the House Committee on Appropriations at 9:00 AM. (Committee Materials) |
2025-02-28 | Minority; without recommendation. (Minority Report) |
2025-02-26 | Public hearing in the House Committee on Appropriations at 1:30 PM. (Committee Materials) |
Why Relevant: The bill requires businesses to estimate user ages and apply privacy protections for minors, including high privacy defaults and tools for user control.
Mechanism of Influence: Operators must estimate age with 'reasonable certainty' and apply privacy protections if minors are involved; privacy settings for minors must be set to high by default.
Evidence:
Ambiguity Notes: The standard for 'reasonable certainty' in age estimation is not defined. No explicit requirement for robust, technical age verification.
Why Relevant: The bill restricts addictive feeds and requires parental consent for notifications to minors.
Mechanism of Influence: Operators may not provide addictive feeds to minors without verification and parental consent; must provide mechanisms for users to limit exposure.
Evidence:
Ambiguity Notes: The definition of 'addictive feed' may be subject to interpretation.
Why Relevant: The bill prohibits the collection and use of personal information from minors under 13, except for age assurance purposes.
Mechanism of Influence: Businesses must not collect, sell, share, or retain personal information from minors under 13, except as needed for age assurance.
Evidence:
Ambiguity Notes: No technical requirements for how age assurance is performed or data is protected.
This legislation introduces a new chapter to Title 19 RCW, establishing definitions and requirements for online services that cater to minors. It outlines the responsibilities of businesses to ensure the safety and privacy of minors online, including age estimation, data collection limitations, and restrictions on addictive content. The bill also prohibits the use of dark patterns that could manipulate minors and sets forth specific actions businesses must take to protect young users.
Date | Action |
---|---|
2025-02-07 | First reading, referred to Business, Financial Services & Trade. (View Original Bill) |
Why Relevant: The bill requires businesses to estimate the age of users and apply privacy protections accordingly, and if unable to verify, to apply protections to all users.
Mechanism of Influence: This creates an age assessment/verification regime, a core feature of the EU Chat Act.
Evidence:
Ambiguity Notes: The standard is 'reasonable certainty' for age estimation, but does not specify technical means or mandate third-party verification.
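The "apply protections to all users if unable to verify" fallback is a conservative default that can be modeled as a confidence threshold; the threshold value below is arbitrary, chosen only to illustrate the logic:

```python
def protections_required(estimated_age: int | None, confidence: float,
                         threshold: float = 0.9) -> bool:
    """Treat the user as a minor whenever age cannot be estimated with
    'reasonable certainty' (modeled here as a confidence threshold)."""
    if estimated_age is None or confidence < threshold:
        return True   # cannot verify -> apply minor protections by default
    return estimated_age < 18

print(protections_required(25, confidence=0.95))   # False: confident adult
print(protections_required(25, confidence=0.40))   # True: low confidence
print(protections_required(None, confidence=1.0))  # True: no estimate at all
```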
Why Relevant: Operators of addictive services must ensure users are not minors before providing addictive feeds.
Mechanism of Influence: This is a form of age gating, requiring verification before access to certain features.
Evidence:
Ambiguity Notes: Does not specify technical standards for verification or define 'addictive feed' in detail.
Why Relevant: Prohibits collection, sale, or retention of personal information from minors under 13, except for age verification.
Mechanism of Influence: Restricts data practices for minors, though not as broad as EU Chat Act's mandatory risk mitigation.
Evidence:
Ambiguity Notes: Allows collection solely for age assurance; does not require ongoing risk assessments or mitigation plans.
Legislation ID: 73505
Bill URL: View Bill
This bill introduces a new article in the Code of West Virginia that mandates manufacturers and developers to implement measures for parental control over software application downloads for users under the age of 16. It establishes definitions related to applications, children, and manufacturers, outlines requirements for obtaining parental consent, and sets penalties for non-compliance. The bill also addresses anticompetitive conduct and enforcement mechanisms related to these new regulations.
Date | Action |
---|---|
2025-03-19 | To House Energy and Public Works |
2025-03-14 | Markup Discussion |
2025-02-20 | To House Environment, Infrastructure, and Technology
2025-02-19 | To Energy and Public Works then Judiciary |
2025-02-19 | Filed for introduction |
2025-02-19 | Introduced in House |
2025-02-19 | To House Energy and Public Works |
Why Relevant: The bill imposes statutory duties on app stores and device manufacturers to implement age determination and parental consent mechanisms, which are core forms of age verification/assessment and app-store gatekeeping.
Mechanism of Influence: Covered manufacturers must 'determine the age of the device's primary user upon activation' and app stores must 'obtain parental consent before allowing downloads by children' (see Empowering Parents to Protect their Children). Developers must provide features to support parental controls for users under 18.
Evidence:
Ambiguity Notes: The bill does not specify technical details for age verification or consent mechanisms, nor does it address detection/scanning, risk assessments, reporting, or encryption.
Legislation ID: 73585
Bill URL: View Bill
This bill amends the Code of West Virginia to create a new chapter focused on child online protection and liability. It defines key terms related to sexual material harmful to minors and outlines the responsibilities of commercial entities in verifying the age of individuals accessing such material. The bill also establishes penalties for non-compliance, including civil actions and damages, while providing exceptions for news organizations and clarifying the role of internet service providers.
Date | Action |
---|---|
2025-03-11 | To House Judiciary |
2025-03-10 | Markup Discussion |
2025-03-10 | To House Legal Services |
2025-02-20 | To Judiciary |
2025-02-20 | Introduced in House |
2025-02-20 | Filed for introduction |
2025-02-20 | To House Judiciary |
Why Relevant: The bill explicitly requires commercial entities to implement 'reasonable age verification' to prevent minors from accessing harmful sexual material.
Mechanism of Influence: Entities must use digital identification or compliant age verification systems for users accessing certain content, with civil liability for noncompliance (see 'Liability for failing to perform reasonable age verification').
Evidence:
Ambiguity Notes: The bill does not specify technical details for age verification, leaving it to rulemaking. No mention of biometric, facial recognition, or specific technology.
Why Relevant: The bill delegates authority to the Office of Technology to establish compliance rules, which could shape how age verification is implemented across platforms.
Mechanism of Influence: Potential for future requirements on compliance infrastructure and process, depending on rulemaking.
Evidence:
Ambiguity Notes: No explicit mandates for audits, legal reps, transparency, or reporting; only age verification compliance.
Legislation ID: 72462
Bill URL: View Bill
Senate Bill 293 seeks to amend the Code of West Virginia by adding a new section that addresses the publishing or distribution of material harmful to minors on the Internet. It defines what constitutes material harmful to minors and establishes requirements for commercial entities to verify the age of individuals accessing such material, thereby aiming to protect minors from exposure to inappropriate content online.
Date | Action |
---|---|
2025-02-12 | Filed for introduction |
2025-02-12 | Introduced in Senate |
2025-02-12 | To Judiciary |
Why Relevant: The bill contains an explicit age verification mandate for access to certain online content, which is a core element of the EU Chat Act's approach.
Mechanism of Influence: It requires commercial entities to verify that users are 18 or older using a 'commercially available database' before allowing access to 'material harmful to minors.'
Evidence:
Ambiguity Notes: The bill does not specify technical details of the verification process or whether it applies to interpersonal communication or hosting services. It is limited to content deemed 'harmful to minors.'
Senate Bill 633, known as the Protect Act, seeks to amend the Code of West Virginia to require mandatory age verification for access to online pornographic content. It defines key terms, outlines the age verification process, prohibits circumvention of these measures, establishes penalties for violations, and provides a legal defense for compliant entities. The bill emphasizes the protection of minors from harmful online material.
Date | Action |
---|---|
2025-02-28 | Introduced in Senate |
2025-02-28 | To Judiciary |
2025-02-28 | Filed for introduction |
Why Relevant: The bill creates a statutory duty for 'commercial adult websites' to implement multiple forms of age verification for access, including biometric and government ID checks.
Mechanism of Influence: Requires platforms to reliably identify adult users and prevent minor access by mandating biometric, ID, and credit card verification, with penalties for non-compliance.
Evidence:
Ambiguity Notes: The bill is explicit about the verification methods but does not extend to general online services, messaging, or social media platforms.
Why Relevant: The bill prohibits circumvention, including the use of VPNs, and establishes heavy penalties for non-compliance.
Mechanism of Influence: Imposes operational bans and large fines on sites that allow minor access or fail to implement required controls.
Evidence:
Ambiguity Notes: Scope is limited to pornographic content providers, not general communication or hosting services.
Legislation ID: 112488
Bill URL: View Bill
This bill introduces regulations concerning the publication and distribution of materials harmful to minors on the Internet. It mandates that business entities implement reasonable age verification methods before allowing access to such materials. The bill defines material harmful to minors and obscene material, and establishes penalties for violations, including civil liability. It also includes exemptions for bona fide news organizations and protects Internet service providers from liability under certain conditions.
Date | Action |
---|---|
2025-10-08 | Public hearing held |
2025-03-21 | Read first time and referred to committee on Mental Health, Substance Abuse Prevention, Children and Families |
2025-03-20 | Assembly Amendment 1 adopted |
2025-03-20 | Read a third time and passed, Ayes 69, Noes 22, Paired 2 |
2025-03-20 | Ordered to a third reading |
2025-03-20 | Assembly Amendment 1 offered by Representative Goeben |
2025-03-20 | Read a second time |
2025-03-20 | Ordered immediately messaged |
Why Relevant: The bill imposes a duty on online businesses to verify users' ages before providing access to certain content, which is a core element of the EU Chat Act's approach to age verification and gating access for minors.
Mechanism of Influence: By requiring 'reasonable age verification methods,' the bill compels covered websites to implement technical or procedural barriers to determine user age, likely affecting privacy and access to online content.
Evidence:
Ambiguity Notes: The scope is limited to 'material harmful to minors' and 'obscene material,' not general interpersonal messaging or all user-generated content. The definition of 'reasonable age verification method' may be broad or left to future interpretation, which could range from self-attestation to government ID checks.
Legislation ID: 112165
Bill URL: View Bill
This legislation prohibits business entities from knowingly publishing or distributing material harmful to minors on the Internet unless they implement reasonable age verification methods. It also defines material harmful to minors and obscene material, outlining the criteria for each. Violations may result in civil liability, and certain exemptions are provided for news organizations and Internet service providers.
Date | Action |
---|---|
2025-10-08 | Public hearing held |
2025-09-10 | Senate Amendment 1 offered by Senator Wanggaard |
2025-03-17 | Representative OConnor added as a cosponsor |
2025-03-14 | Read first time and referred to Committee on Mental Health, Substance Abuse Prevention, Children and Families |
2025-03-14 | Introduced by Senators Wanggaard, Feyen, Jacque and Nass; cosponsored by Representatives Goeben, B. Jacobson, Penterman, Kreibich, Dittrich, Allen, Knodl, Wichgers, Murphy, Brill, Mursau and Behnke |
Why Relevant: The bill explicitly requires 'reasonable age verification methods' for access to certain online content, which is a core element of the EU Chat Act's approach.
Mechanism of Influence: Websites with a substantial portion of material harmful to minors must verify user age, likely requiring users to provide ID or use third-party verification services. This can impact privacy and user rights by requiring disclosure of personal data to access content.
Evidence:
Ambiguity Notes: 'Reasonable age verification methods' is not defined in detail, leaving open what methods are acceptable (ID upload, database checks, etc.). The scope ('substantial portion') may also be subject to interpretation.
Why Relevant: The bill prohibits retention of identifying information post-verification, which is relevant to privacy implications of age verification.
Mechanism of Influence: Websites must design systems that verify age but do not store personal data, which may limit tracking but could also complicate enforcement or re-verification.
Evidence:
Ambiguity Notes: Does not specify what constitutes 'identifying information,' nor how compliance will be audited or enforced.
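A minimal sketch of the "verify but do not retain" pattern the bill implies, assuming the site keeps only the pass/fail outcome and a timestamp rather than the submitted document (all identifiers hypothetical):

```python
import time
from dataclasses import dataclass

@dataclass
class VerificationResult:
    # Only the outcome and when it happened are kept; the document itself
    # is processed in memory and never written to storage.
    passed: bool
    verified_at: float

def verify_and_discard(id_document_bytes: bytes) -> VerificationResult:
    # Hypothetical check standing in for a real ID/age inspection step.
    passed = len(id_document_bytes) > 0
    result = VerificationResult(passed=passed, verified_at=time.time())
    # Drop the sensitive payload reference; nothing identifying is retained,
    # which also means re-verification requires a fresh submission.
    del id_document_bytes
    return result

if __name__ == "__main__":
    print(verify_and_discard(b"<scanned-id-image>"))
```

This illustrates the enforcement tension noted above: with no retained artifact, an auditor can only inspect the system's design, not its past decisions.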
Why Relevant: Requires prevention of access from known VPN addresses, which is a technical mitigation feature to enforce age verification.
Mechanism of Influence: Websites must use VPN-detection tools to block users attempting to bypass age checks, potentially overblocking legitimate users or further impacting user privacy through IP-based screening.
Evidence:
Ambiguity Notes: 'Known VPN addresses' is not defined, and the effectiveness of such blocking is variable.
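As a sketch of the mitigation the bill contemplates, a site might screen client IPs against a published VPN/proxy range list before running the age check. The ranges below are documentation addresses standing in for a real feed; in practice sites would license a commercial list, and its accuracy would bound the policy's effectiveness:

```python
import ipaddress

# Hypothetical "known VPN" ranges (RFC 5737 documentation blocks as stand-ins).
KNOWN_VPN_RANGES = [
    ipaddress.ip_network("203.0.113.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]

def is_known_vpn(client_ip: str) -> bool:
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in KNOWN_VPN_RANGES)

if __name__ == "__main__":
    print(is_known_vpn("203.0.113.7"))  # True -> access would be refused
    print(is_known_vpn("192.0.2.1"))    # False -> age check proceeds normally
```

The overblocking risk flagged above falls directly out of this design: any non-VPN user whose address lands in a listed range is refused the same way.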
Legislation ID: 13535
Bill URL: View Bill
This legislation establishes a framework for requiring parental consent for minors under the age of eighteen to create accounts on social media platforms. It defines key terms, mandates social media companies to obtain express consent from a parent or guardian, and outlines the procedures for verification of such consent. Additionally, it empowers the attorney general to enforce compliance through the Wyoming Consumer Protection Act.
| Date | Action |
|---|---|
| 2025-02-03 | H Did not Consider for Introduction |
| 2025-01-02 | H Received for Introduction |
| 2024-12-05 | Bill Number Assigned |
Why Relevant: The bill imposes a direct age-verification and parental-consent obligation on social media platforms, which is a core element of the EU Chat Act's approach to risk mitigation for minors.
Mechanism of Influence: Social media platforms must implement systems to reliably verify age and obtain parental consent for minors, and deny access if requirements are not met.
Evidence:
Ambiguity Notes: The summary does not specify the technical details of verification, nor does it require detection/scanning, reporting, mitigation plans, or address encryption.
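Since the summary leaves the mechanism open, here is a minimal sketch of the account-gating workflow such a statute implies, assuming the platform records a parent's express consent before activating a minor's account (all names and thresholds hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class Account:
    username: str
    age: int
    parental_consent: bool = False
    active: bool = False

@dataclass
class Platform:
    accounts: dict = field(default_factory=dict)

    def register(self, username: str, age: int) -> str:
        acct = Account(username, age)
        self.accounts[username] = acct
        if age < 18:
            # Under the bill, activation must wait for verified parental consent.
            return "pending parental consent"
        acct.active = True
        return "active"

    def record_parental_consent(self, username: str) -> str:
        # How the parent's identity and relationship are verified is exactly
        # the technical detail the summary does not specify.
        acct = self.accounts[username]
        acct.parental_consent = True
        acct.active = True
        return "active"

if __name__ == "__main__":
    p = Platform()
    print(p.register("kid01", 15))             # pending parental consent
    print(p.record_parental_consent("kid01"))  # active
```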
Legislation ID: 13544
Bill URL: View Bill
The bill introduces regulations requiring websites that contain material harmful to minors to implement age verification measures. It specifies the definitions of relevant terms, outlines the responsibilities of covered platforms, and establishes legal remedies for violations, including civil actions for damages. The bill also clarifies its applicability and exceptions, ensuring that it aligns with existing laws and constitutional provisions.
| Date | Action |
|---|---|
| 2025-03-13 | Assigned Chapter Number 139 |
| 2025-03-13 | Governor Signed HEA No. 0070 |
| 2025-03-05 | S President Signed HEA No. 0070 |
| 2025-03-05 | H Speaker Signed HEA No. 0070 |
| 2025-03-04 | Assigned Number HEA No. 0070 |
| 2025-03-04 | S 3rd Reading:Passed 28-3-0-0-0 |
| 2025-03-03 | S 2nd Reading:Passed |
| 2025-02-28 | S COW:Passed |
Why Relevant: The bill imposes a duty on covered platforms to implement age verification for users accessing harmful material, directly addressing the 'age verification/assessment' element.
Mechanism of Influence: Covered platforms must use reasonable methods to verify user age and are prohibited from retaining identifying information after granting access. This affects user privacy and access rights but does not extend to broader risk management or scanning obligations.
Evidence:
Ambiguity Notes: 'Reasonable age verification methods' are not defined in detail; the technological means are left open, which could range from self-attestation to more intrusive checks, depending on interpretation or later regulation.
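One way to reconcile the verification duty with the retention ban is a short-lived, signed token asserting only "age check passed," so the platform never holds the underlying document after the initial check. A sketch using Python's standard library (the key, TTL, and token layout are all hypothetical parameters, not anything the bill prescribes):

```python
import hashlib
import hmac
import time

SECRET = b"server-side-signing-key"  # hypothetical; a real deployment would rotate keys
TTL_SECONDS = 3600

def issue_age_token(session_id: str) -> str:
    # The token binds a session to a pass outcome and an expiry;
    # no name, birthdate, or document data is encoded.
    expires = str(int(time.time()) + TTL_SECONDS)
    payload = f"{session_id}.{expires}"
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}.{sig}"

def check_age_token(token: str) -> bool:
    try:
        session_id, expires, sig = token.rsplit(".", 2)
    except ValueError:
        return False
    payload = f"{session_id}.{expires}"
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected) and time.time() < int(expires)

if __name__ == "__main__":
    tok = issue_age_token("sess-42")
    print(check_age_token(tok))        # True while unexpired
    print(check_age_token(tok + "x"))  # False: signature mismatch
```

The trade-off mirrors the ambiguity noted above: the less a platform retains, the less intrusive the scheme, but the more often users must re-verify once tokens expire.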