Meta shut down internal research into how Facebook and Instagram affect users’ mental health after discovering causal evidence that its platforms were harming people, according to newly unredacted court filings in a class-action lawsuit brought by US school districts against Meta and other social media companies.
The documents reveal that in 2020, Meta launched an internal study called “Project Mercury,” in which its researchers worked with Nielsen to measure the impact of users “deactivating” Facebook and Instagram. The results were not what the company had hoped for. According to the filings, internal documents said “people who stopped using Facebook for a week reported lower feelings of depression, anxiety, loneliness and social comparison.”
Instead of releasing the findings or conducting further research, Meta allegedly halted the project and internally argued that the negative results were influenced by the “existing media narrative” surrounding the company.
Privately, however, staff assured Nick Clegg, then Meta’s head of global public policy, that the research was sound.
One staff researcher wrote that “The Nielsen study does show causal impact on social comparison,” adding an unhappy-face emoji. Another employee reportedly compared staying silent about the findings to the tobacco industry “doing research and knowing cigs were bad and then keeping that info to themselves.”
Despite its own internal evidence pointing to a causal link between its products and negative mental health effects, the filing claims Meta told Congress it had no way to determine whether its platforms harmed teenage girls.
In a statement on Saturday, Meta spokesperson Andy Stone said the company dropped the study because its methodology was flawed and insisted Meta has consistently worked to make its products safer.
“The full record will show that for over a decade, we have listened to parents, researched issues that matter most, and made real changes to protect teens,” he said.
Plaintiffs say tech companies concealed known risks
The accusation that Meta buried damaging research is one of many contained in a late-Friday filing by Motley Rice, the law firm representing school districts suing Meta, Google, TikTok and Snapchat. Broadly, the plaintiffs argue that the companies intentionally withheld information about risks they themselves had identified, keeping parents, teachers and users in the dark.
TikTok, Google and Snapchat did not immediately comment.
The filing alleges the platforms did everything from allowing children under 13 to use their services, to failing to tackle child sexual abuse material, to actively promoting the use of their apps by teenagers during school hours. Plaintiffs also claim the companies tried to pay child-focused organisations to publicly defend the safety of their platforms.
In one example, TikTok sponsored the National PTA and then, according to internal messages cited in the filing, celebrated its ability to sway the organisation. TikTok officials reportedly wrote that the PTA would “do whatever we want going forward in the fall… [t]hey’ll announce things publicly, [t]heir CEO will do press statements for us.”