State Kids Online Safety Act
The State Kids Online Safety Act enhances protections for minors online by imposing duties of care on platforms, requiring parental tools for monitoring, and prohibiting harmful design practices. This legislation aims to safeguard children and teens from harmful content, protect their mental well-being, and ensure accountability for online platforms. Transparency and reporting measures, coupled with enforcement by the State Attorney General and a private right of action, ensure compliance and offer recourse for violations.
Key Provisions
Platform Duty of Care: Requires platforms to act in the best interests of minors, prevent exposure to harmful content, and limit algorithmic amplification of such material.
Parental Tools and Controls: Mandates user-friendly tools for parents to monitor online activity, set time limits, and restrict content access at no additional cost.
Transparency and Reporting Requirements: Platforms must publish annual reports on risks to minors and measures taken to mitigate those risks. Independent researchers must have access to anonymized data for risk assessment.
Prohibition on Harmful Design Practices: Bans design features that exploit minors’ vulnerabilities, promote harmful content, or collect minors’ data without parental consent.
State Oversight and Enforcement: The State Attorney General oversees compliance, with civil penalties of up to $7,500 per affected minor per violation and injunctive relief for non-compliance.
Private Right of Action: Parents or guardians may file civil actions for actual damages and statutory damages of up to $2,500 per violation, plus reasonable attorneys’ fees and costs.
Model Language
Section 1: Title: This Act shall be known as the “State Kids Online Safety Act.”
Section 2: Findings and Purpose
Findings:
Children and teens face heightened risks online from harmful content, addictive platform designs, and privacy invasions.
Protecting minors from exposure to harmful content such as cyberbullying, self-harm, eating disorders, and sexual exploitation is a compelling public interest.
Parents and guardians require tools to monitor and guide their children’s online activities effectively.
Purpose: To protect the physical and mental well-being of minors online by enhancing accountability for online platforms, empowering families, and establishing safeguards against harmful practices.
Section 3: Definitions
“Minor” means any individual under the age of 18.
“Covered Platform” means any online platform, website, or application that operates within the state and has more than 1 million monthly active users.
“Harmful Content” includes but is not limited to:
a. Content promoting self-harm, suicide, eating disorders, substance abuse, or sexual exploitation.
b. Cyberbullying or harassment.
c. Content promoting violence or hate speech.
“Algorithmic Recommendations” means the use of algorithms to deliver personalized content to users.
Section 4: Platform Duty of Care
Covered platforms shall act in the best interests of minors to protect them from harmful content and ensure a safe online environment.
Platforms shall take reasonable steps to:
Prevent and mitigate exposure to harmful content.
Limit algorithmic amplification of harmful content.
Protect minors’ mental health and well-being.
Section 5: Parental Tools and Controls
Platforms shall provide easily accessible tools enabling parents or guardians to:
Monitor their child’s online activity.
Set time limits for usage.
Restrict access to certain types of content.
Platforms shall ensure that tools are user-friendly and provided at no additional cost.
Section 6: Transparency and Reporting Requirements
Platforms shall publish annual reports detailing:
Risks identified to minors’ health and safety.
Measures taken to address those risks.
The impact of platform algorithms on minors.
Platforms shall provide independent researchers with access to anonymized data for assessing risks to minors.
Section 7: Prohibition on Harmful Design Practices
Platforms shall not employ design features intended to:
Exploit minors’ vulnerabilities to increase engagement or screen time.
Promote harmful content through algorithmic amplification.
Collect or share minors’ data without explicit consent from a parent or guardian.
Section 8: State Oversight and Enforcement
The State Attorney General shall oversee compliance with this Act.
Violations shall be subject to:
Civil penalties of up to $7,500 per affected minor per violation.
Injunctive relief to prevent further violations.
The Attorney General may work with other state agencies to:
Investigate complaints.
Conduct audits of covered platforms.
Section 9: Private Right of Action
Parents or guardians of affected minors may bring a civil action against platforms for:
Actual damages.
Statutory damages of up to $2,500 per violation.
Reasonable attorneys’ fees and costs.
Section 10: Severability: If any provision of this Act is found to be invalid, the remaining provisions shall remain in full force and effect.
Section 11: Effective Date: This Act shall take effect 180 days after its enactment.