Social media giant Meta is expanding its "Teen Accounts" feature to Facebook and Messenger globally, after initially making it available only to users in the U.S., U.K., Australia, and Canada.
The accounts, which first launched on Instagram last September, automatically enforce privacy settings, content restrictions, and parental controls for younger users.
Teen Accounts were originally launched shortly after Meta and other popular social networks were grilled by U.S. lawmakers for not doing enough to protect teens on their services.
With the global expansion on Facebook and Messenger, teens will now automatically be placed into an experience that is designed to limit inappropriate content and unwanted contact. Teens under the age of 16 need their parents’ permission to change any of the settings.
Additionally, teens will only receive messages from people they follow or have messaged before, and only their friends can see and reply to their stories. Tags, mentions, and comments will likewise be limited to people they follow or who are their friends.
Teens will also receive reminders to leave the social networks after using them for an hour a day, and they’ll be enrolled in “Quiet mode” overnight.
"We know parents are worried about strangers contacting their teens – or teens receiving unwanted contact. In addition to the existing built-in protections offered by Teen Accounts, we’re adding new restrictions for Instagram Live and unwanted images in DMs. With these changes, teens under 16 will be prohibited from going Live unless their parents give them permission to do so. We’ll also require teens under 16 to get parental permission to turn off our feature that blurs images containing suspected nudity in DMs." Meta said.
"We want to make it easier for parents to have peace of mind when it comes to their teens’ experiences across Meta’s apps. Teen Accounts on Facebook and Messenger will offer automatic protections to limit inappropriate content and unwanted contact, as well as ways to ensure teens’ time is well spent."
The expansion of Teen Accounts comes as research led by a Meta whistleblower recently found that children and teens are still at risk from online harm on Instagram, even after the company has rolled out protections. The study found that despite being placed into Teen Accounts, young users can still come across suicide and self-harm posts, along with posts describing demeaning sexual acts. Meta has disputed the claims and said its protections have led to teens seeing less harmful content.
Meta also announced on Thursday that it’s officially launching the School Partnership Program, which allows educators to report safety concerns, like bullying, directly to Instagram for quicker review and removal.
The company says it piloted the program earlier this year and heard positive feedback from participating schools. Now, all middle and high schools in the U.S. can sign up to receive prioritized reporting and educational resources. Participating schools will display a banner on their Instagram profile to notify parents and students that they are an official Instagram partner.
Thursday’s announcement marks Meta’s latest step toward addressing teen mental health concerns tied to social media and a major step in its broader initiative to safeguard young users across its social media ecosystem.

Shaping the Future of Teen Safety Across Meta Platforms
Looking ahead, Meta says it plans to continue refining Teen Account features and expanding its safety measures, drawing on feedback from teens and parents to ensure young people have a secure, positive experience across Instagram, Facebook, and Messenger.
As scrutiny of social media's effect on young users grows, Meta's combination of privacy tools, parental controls, and content moderation reflects a broader industry push to prioritize safety for vulnerable groups like teens.
