Recently unsealed court filings in a major lawsuit against Meta, TikTok, Snapchat, and YouTube have revealed internal staff communications comparing the companies' own platforms to addictive drugs. Employees reportedly used phrases such as “IG is a drug” and “we’re basically pushers,” heightening concerns that social-media companies knowingly built products that exploit teen vulnerabilities for profit.
The documents include thousands of pages of internal emails, research reports, and product discussions. They suggest that the companies understood how features like infinite scrolling, algorithmic recommendations, “likes,” streaks, and push notifications could fuel compulsive use—especially among adolescents whose brains are more sensitive to reward loops and peer comparison.
The filings also indicate that proposed safety features, such as hiding like-counts or strengthening teen privacy defaults, were delayed or rejected when they threatened key business metrics like engagement and user growth. Some internal teams reportedly warned of mental-health impacts, but those recommendations were not prioritized.
School districts involved in the lawsuit argue that the platforms have contributed to rising rates of anxiety, depression, cyberbullying, and classroom disruption. They claim schools have been forced to divert resources to support students struggling with the emotional and behavioral fallout of constant online engagement.
For parents, educators, and youth organizations, the revelations underscore a critical truth: social-media platforms are not neutral tools. Their design choices shape teen behavior, attention, and emotional well-being—and internal documents suggest those choices were knowingly optimized to maximize time spent on the platforms, not safety.
As policymakers consider stronger regulations and platforms face mounting legal pressure, these filings have intensified calls for greater transparency, safer design standards for minors, and stronger digital-wellness education.