Leaked internal documents have exposed how Meta prioritized teen engagement over safety, revealing that the company knowingly targeted vulnerable youth despite internal research confirming a negative impact on their well-being.
Internal Documents Reveal a Pattern of Neglect
A 2019 internal study that surfaced in recent legal filings shows Meta conducted 24 in-depth interviews with users identified as having “problematic” usage patterns, a group the report estimated at roughly 12.5% of the user base. The report explicitly stated: “The best external research indicates that Facebook’s impact on people’s well-being is negative.”
Additional documents point to direct involvement from leadership. In comments cited in the filings, CEO Mark Zuckerberg noted that for Facebook Live to gain traction with teens, the company would need to be “very good at not notifying parents / teachers.”
Prioritizing Retention Over Well-being
Meta employees frequently discussed aggressive strategies to keep teens tethered to their phones. One employee, in an email to CPO Chris Cox, joked about optimizing the platform for “sneaking a look at your phone in the middle of Chemistry.”
Max Eulenstein, Meta’s VP of Product, admitted in a 2021 email that the company’s product teams were actively working to maximize daily app opens, acknowledging that this was not a user-driven desire but a corporate objective.
Meta’s Defense and Current Safety Measures
In response to the disclosures, a Meta spokesperson said many of the documents are nearly a decade old. The company pointed to the 2024 launch of “Instagram Teen Accounts,” which feature private-by-default settings, restricted tagging, and mandatory time-limit reminders after 60 minutes of use for users under 16.
Whistleblower Testimony and Legislative Challenges
Kelly Stonelake, a former Meta Director of Product Marketing, says the internal revelations reflect her own experience at the company. Stonelake, who is currently suing Meta over alleged discrimination, claims her warnings about the lack of content moderation in Horizon Worlds were systematically ignored.
The U.S. government has intensified its scrutiny of Meta since the 2021 Frances Haugen leaks. However, proposed legislative solutions remain controversial. While the Kids Online Safety Act has gained momentum, privacy activists warn that such laws could lead to excessive censorship and adult surveillance.
Stonelake has since turned against the current version of the bill, citing preemption clauses that could shield tech companies from state-level accountability. “There is language in the latest version that would close the courthouse doors to school districts, to bereaved families, to states,” she warned. She advocates for nuanced solutions rather than the polarized political rhetoric currently dominating the debate.
