Instagram is Pausing Development on a Version for Kids Under 13 Years Old

The parent-controlled version, which would have made Instagram accessible to those under 13 years old, has come under scrutiny from lawmakers and regulators.

Earlier this year, internal documents obtained by BuzzFeed News revealed that Instagram was developing a new version of its social media platform for those under 13 years old. Now "Instagram Kids" is being put on hold amid concerns about youth mental health.

"While we stand by the need to develop this experience, we've decided to pause this project. This will give us time to work with parents, experts, policymakers and regulators, to listen to their concerns, and to demonstrate the value and importance of this project for younger teens online today," said Adam Mosseri, Head of Instagram, in a statement published on September 27.

The news comes after The Wall Street Journal revealed that researchers at Facebook, which owns Instagram, had repeatedly found that the platform is toxic for teenage girls, especially when it comes to mental health and body image. The U.S. Senate is scheduled to hold a hearing about this internal research, titled "Protecting Kids Online: Facebook, Instagram, and Mental Health Harms," on September 30.


Instagram Kids was announced in May 2021; it was intended to provide an appropriate experience for tweens ages 10-12, who were using the app despite age restrictions. (Children must currently provide their age while signing up for an Instagram account, and those under 13 can't use the photo-sharing app without verifiable parental consent.) The special-purpose app would be managed by parents, and it wouldn't show any advertisements.

The idea received backlash from the start. Shortly after the plans were announced, attorneys general from 44 states and territories urged Facebook to abandon Instagram Kids, and the National Association of Attorneys General sent an open letter to Facebook CEO Mark Zuckerberg laying out their concerns.

"Use of social media can be detrimental to the health and well-being of children, who are not equipped to navigate the challenges of having a social media account. Further, Facebook has historically failed to protect the welfare of children on its platforms," the letter said. Writers also stated: "It appears that Facebook is not responding to a need, but instead creating one, as this platform appeals primarily to children who otherwise do not or would not have an Instagram account."

Facebook responded to the letter in defense of its plan. "As every parent knows, kids are already online. We want to improve this situation by delivering experiences that give parents visibility and control over what their kids are doing," according to a statement by a Facebook spokesperson. "We are developing these experiences in consultation with experts in child development, child safety and mental health, and privacy advocates. We also look forward to working with legislators and regulators, including the nation's attorneys general."

This isn't the first time that social media companies have tried catering to a younger audience. In 2017, Facebook launched Messenger Kids, a parent-controlled video calling and messaging app for children between the ages of 6 and 12. The app received plenty of criticism from parents and consumer-privacy advocates, especially after a 2019 "technical error" allowed children to join group chats with strangers. Another app for children, YouTube Kids, has also faced controversy because kids can get around its content filters and find inappropriate videos.
