Milwaukee Joins Growing Number of Districts Seeking Damages from Tech Giants
Milwaukee Public Schools joined a nationwide legal battle against social media companies on January 12, 2025, filing a class-action lawsuit that claims platforms such as Instagram, TikTok, and Snapchat deliberately design addictive features that harm children’s mental health. The district’s action follows similar suits from at least nine other school systems across six states, including a landmark case from Seattle that accuses tech giants of creating a public health crisis. These legal challenges emerge as social media companies face intensifying scrutiny over their impact on youth mental health, with schools reporting unprecedented demands for student mental health services.
5 Key Points
- Milwaukee Public Schools filed its social media lawsuit on January 12, 2025, targeting Instagram, Google, YouTube, Snapchat, and TikTok.
- School districts claim social media platforms deliberately create addictive algorithms that harm students’ mental health.
- An early case in California revealed that Meta’s algorithms pushed harmful content to a 12-year-old user, contributing to an eating disorder.
- Studies link extensive social media use to depression, anxiety, and eating disorders in youth.
- About 40% of children between ages 8 and 12 use social media despite platform rules restricting accounts to users 13 and older.
A Wave of School District Lawsuits Targets Social Media Giants
The Milwaukee Public Schools’ legal action represents the latest development in a growing movement of school districts challenging social media companies through the courts. At least nine other districts across Arizona, California, New Jersey, Oregon, Pennsylvania, and Washington have filed similar lawsuits. These districts claim social media platforms deliberately design addictive algorithms that harm students’ mental health while placing additional burdens on school resources. The Seattle School District pioneered this approach with its public nuisance claim against Meta, Snap, Google, and ByteDance, citing unprecedented demands for student mental health services.
Early Cases Reveal Patterns of Harm to Young Users
The lawsuit filed by Alexis Spence’s parents in Northern California exposed concerning practices at Meta, Instagram’s parent company. Spence created an Instagram account at age 12, bypassing the platform’s age verification without her parents’ knowledge. Her initial searches for fitness content led to algorithmic recommendations promoting extreme dieting and eating disorders. The case gained additional significance when a Meta whistleblower presented documents showing the company was aware of its platforms’ negative impact on young users. Although Meta says it prohibits content promoting self-harm, suicide, or eating disorders, the plaintiffs argue that such content continues to reach children through recommendation algorithms.
Mental Health Crisis Strains School Resources
School districts report alarming increases in mental health-related emergencies among students who frequently use social media platforms. The legal complaints describe rising rates of emergency room visits, classroom disruptions, absenteeism, and tardiness. These behavioral changes mirror the addiction patterns seen with cigarettes and other harmful substances. Districts must now allocate additional resources to address depression, anxiety, eating disorders, and other mental health conditions linked to social media use. The Milwaukee Public Schools statement emphasizes its role as “one of the primary providers of mental health services to youth,” arguing the district “has been directly harmed by Defendants’ conduct.”
Youth Access Raises Age Verification Concerns
Despite platform policies restricting access to users 13 and older, studies show approximately 40% of children between ages 8 and 12 maintain active social media accounts. Parents and school officials point to ineffective age verification systems and inadequate parental controls as key factors enabling underage access. The lawsuits claim social media companies design these systems to be easily circumvented, giving young children extended exposure to age-inappropriate content and features. This exposure occurs during critical developmental periods, when children lack adult levels of impulse control and are especially vulnerable to algorithmic manipulation.
States Begin Legislative Response
Some states have begun implementing protective measures while these lawsuits proceed through the courts. Utah enacted legislation requiring social media platforms to disable minors’ accounts by default between 10:30 p.m. and 6:30 a.m., unless a parent adjusts the setting. Other states have introduced stricter age-verification requirements for account creation. These legislative efforts aim to address immediate concerns while the broader legal challenges seek long-term changes in how social media companies interact with young users. The Supreme Court’s pending decision in TikTok’s national security case may also influence future regulation of social media platforms.
Gender Disparities in Social Media Impact
Research cited in the lawsuits indicates that while social media addiction affects all youth, young girls face heightened risks for specific mental health challenges. The complaints detail how platforms’ content algorithms can intensify body dysmorphia, eating disorders, and self-esteem issues among female users. However, male users also report significant mental health impacts from extensive platform use. The lawsuits argue these gender-specific harms demonstrate how platforms’ algorithms target and exploit psychological vulnerabilities in different user demographics.