Meta announced on Thursday that it is rolling out its “teen accounts” globally for Facebook and Messenger, after first introducing them in select English-speaking countries.
According to the company, “We’ve placed hundreds of millions of teens in Teen Accounts across Instagram, Facebook, and Messenger, and now we’re expanding them to teens around the world on Facebook and Messenger.”
These specialized accounts, aimed at users between 13 and 17 years old, come with built-in safety features, stricter content controls, and parental oversight. Meta initially launched the feature on Instagram last year before extending it in April to Facebook and Messenger in the US, Canada, Australia, and the UK.
The company explained that restrictions on these accounts are “designed to address parents’ top concerns with automatic protections to limit who their teens are talking to online and the content they’re seeing, and ensure their time is well spent.” For those under 16, these safeguards cannot be lifted without a parent’s approval.
The protections include default settings that automatically limit who can interact with teenagers, restrict the visibility of their online activity, and reduce exposure to potentially harmful or sensitive content. Teen accounts are also designed to help young users manage their time more effectively, nudging them to take breaks and discouraging long, uninterrupted hours of scrolling through feeds.
Meta’s move comes in response to mounting criticism directed at major tech companies over the influence of social media on youth. Parents, educators, and policymakers have long raised alarms about online risks such as cyberbullying, unwanted contact from strangers, and the mental health impact of excessive screen use. By giving parents more control and placing stronger safeguards around younger users, Meta is signaling that it wants to be proactive in addressing these concerns.
Experts say the expansion of teen accounts could serve as a test case for how technology firms balance growth with responsibility. While some praise the initiative as a step in the right direction, others argue that enforcement and monitoring will be key to making sure the protections actually work as intended.
For Meta, which owns Facebook, Instagram, and WhatsApp, the rollout reflects broader efforts to rebuild trust with parents and regulators worldwide. The company has faced lawsuits, government investigations, and public scrutiny over how its platforms handle children’s data and the addictive nature of its services. With teen accounts now expanding globally, Meta is positioning itself as more responsive to ongoing debates about digital safety and the well-being of younger generations.