Meta shuts down research showing its platforms are bad for mental health

Newly released documents show that Meta scrapped internal research into the mental health effects of Facebook and Instagram once it became clear that the results didn’t reflect well on the company.
A post from Time Magazine on X sums up the brutal findings: “Meta was aware that millions of adult strangers were contacting minors on its sites; that its products exacerbated mental health issues in teens; and that content related to eating disorders, suicide, and child sexual abuse was frequently detected, yet rarely removed.”
The 2020 study, dubbed Project Mercury and done in collaboration with survey firm Nielsen, aimed in part to gauge the effect of deactivating Facebook and Instagram accounts. Internal documents reveal what it found: “people who stopped using Facebook for a week reported lower feelings of depression, anxiety, loneliness and social comparison.”
Meta’s response was to shut down the research. CNBC quotes Andy Stone, a spokesperson for Meta, who claims the study was shut down because of flawed methodology and says the allegations “rely on cherry-picked quotes and misinformed opinions in an attempt to present a deliberately misleading picture.”
“The full record will show that for over a decade, we have listened to parents, researched issues that matter most, and made real changes to protect teens – like introducing Teen Accounts with built-in protections and providing parents with controls to manage their teens’ experiences.”
However, a legal brief shows that Nielsen staff assured Meta the findings were valid, and that some even compared sitting on the evidence to the tobacco industry’s cover-up of the harms of cigarettes.
Court filing on behalf of school districts says Meta lets sex traffickers run amok
The Meta allegations arrive as part of a larger lawsuit filed last week by the law firm Motley Rice, which also targets Google, TikTok and Snapchat on behalf of school districts across the U.S. Among the various allegations, court filings show TikTok trying to hijack the National Parent Teacher Association (PTA) to gain influence over its governance.
But, says a report from InnovationAus, by and large, “the allegations against the other social media platforms are less detailed than those against Meta,” which accuse Mark Zuckerberg’s tech giant of intentionally designing ineffective youth safety features; blocking testing; optimizing its products to increase teen engagement despite knowing it would serve kids more harmful content; and stalling internal efforts to prevent child predators from contacting minors on the grounds that it would hurt growth.
The headlining allegation, however, has to be that Meta’s policy was to give people caught using the site for human sex trafficking 16 do-overs: an account could rack up 16 such violations and would only be removed on the 17th. A document described this as “a very, very, very high strike threshold.”
Zuckerberg’s approach to child safety is evident in a text message in which he says it’s not his top concern, “when I have a number of other areas I’m more focused on like building the metaverse.” (Statista estimates that Meta’s Reality Labs division has lost a total of $70 billion to date.)
Meta has told Congress it has no ability to quantify whether its products harm teens. The court filings suggest it lied. That’s probably why Meta has filed a motion to strike the documents.
A hearing is set for January 26 in the U.S. District Court for the Northern District of California.