The Wall Street Journal reviewed internal documents showing that the social media company knows just how toxic Instagram can be for teens, young women in particular, and critics say its efforts to rectify the problem continue to fall short.

We've known for some time that social media can be toxic for teens, especially young women, and Instagram, as an intensely visual platform, is particularly problematic. A study published in 2019 in Psychology of Popular Media Culture found that in young women, the frequency of Instagram use correlated with depressive symptoms, lower self-esteem, general and physical appearance anxiety, and body dissatisfaction. Now, a bombshell report in the Wall Street Journal shows that Facebook, which owns Instagram, is well aware of the teen mental health crisis, and that the company has been purposely playing it down.

The WSJ reports that for the past three years, Facebook employees working in data science, marketing, and product development have been researching exactly how the app affects young people. After all, more than 40 percent of Insta's users are 22 years old or younger, and about 22 million teens use the app daily, according to the report.

In a March 2020 slide presentation posted to Facebook's internal message board, researchers pointed out that 32 percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse. "Comparisons on Instagram can change how young women view and describe themselves," they noted.

Among the many troubling conclusions the WSJ uncovered in the internal documents:

  • "We make body image issues worse for one in three teen girls."
  • "Teens blame Instagram for increases in the rate of anxiety and depression. This reaction was unprompted and consistent across all groups."
  • "Social comparison is worse on Instagram" than on TikTok and Snapchat.
  • The Explore page, which serves up photos and videos curated by an algorithm, points users toward particularly problematic content.
  • More than 40 percent of Instagram users who reported feeling "unattractive" said the feeling began on the app.
  • "Sharing or viewing filtered selfies in stories made people feel worse."
  • Because Instagram is built to share "only the best moments" and perpetuates "pressure to look perfect," teens are sent "spiraling toward eating disorders, an unhealthy sense of their own bodies, and depression," according to March 2020 internal research.
  • While teen girls bear the brunt, Facebook's research also shows that 14 percent of boys in the U.S. said Instagram made them feel worse about themselves, and 40 percent of teen boys experience negative social comparison.

And yet, as the WSJ points out, Facebook, which is currently working on a version of Instagram for children under 13, has played down these negative effects. "The research that we've seen is that using social apps to connect with other people can have positive mental health benefits," CEO Mark Zuckerberg said at a March 2021 congressional hearing.

In response to the WSJ story, Instagram's head of public policy, Karina Newton, published a blog post noting that the company is researching ways to pull users away from dwelling on certain types of Instagram posts.

"We're exploring ways to prompt them to look at different topics if they're repeatedly looking at this type of content," Newton said. "We're cautiously optimistic that these nudges will help point people towards content that inspires and uplifts them, and to a larger extent, will shift the part of Instagram's culture that focuses on how people look."

But those efforts are too little, too late for many parents and lawmakers. For instance, in light of the WSJ story, Rep. Lori Trahan, D-Mass., is calling for Facebook to "immediately abandon plans for Instagram for Kids."

"Facebook's internal documents show that the company's failure to protect children on Instagram—especially young girls—is outright neglect, and it's been going on for years," notes Trahan in a statement posted, ironically, to Facebook. "Facebook has no business developing additional social media platforms explicitly designed for our children when they can't be trusted to keep their current house in order."

She concludes, "It's also clear that Facebook's refusal to release its full internal research on how users are impacted on its platforms is VERY problematic. Without complete data access, independent researchers can't fully detail the harm users face. That must change."