Meta introduces new safety measures for Instagram users under 16, requiring parental consent for livestreaming and unblurring nudity, with changes extending to Facebook and Messenger.
Meta Platforms expands its teen safety measures on Instagram, Facebook, and Messenger, blocking live streaming and nudity-unblur features for users under 16 without parental approval. Image: FP/CH
Menlo Park, California, USA — April 9, 2025:
Instagram users under 16 will soon face new restrictions on the platform as Meta Platforms rolls out expanded safety measures aimed at protecting younger users. Effective immediately, teens will need parental consent to livestream or unblur nudity in direct messages they receive on Instagram. The move is part of a broader effort by Meta to increase oversight of teen activity across its platforms, including Instagram, Facebook, and Messenger.
The social media giant launched its teen account program for Instagram in September, giving parents greater control over their children's online presence in response to growing concerns about social media's impact on young people's mental health. This latest update builds on that initiative, further strengthening protections for users under 18.
In the first phase, the changes will apply to users in the United States, United Kingdom, Canada, and Australia. Meta has confirmed that it will roll out the safety features to users worldwide in the coming months.
Under the new rules, teens under 16 will no longer be able to use Instagram Live unless they have received explicit parental consent. Additionally, they will be unable to disable the feature that blurs images containing suspected nudity in direct messages without permission from a parent.
Meta is also extending these protections to its Facebook and Messenger platforms. Teen accounts on both will carry the same automatic safeguards as on Instagram: accounts set to private by default, private messages from unknown contacts blocked, exposure to sensitive content such as violent videos limited, and reminders encouraging breaks after 60 minutes of app use. Teen accounts will also face restrictions on notifications during bedtime hours.
"Teen Accounts on Facebook and Messenger will offer similar, automatic protections to limit inappropriate content and unwanted contact, as well as ways to ensure teens’ time is well spent," Meta stated in a blog post.
Since the teen account program launched in September, more than 54 million teen accounts have been set up across Meta's platforms. The company said the new measures are part of an ongoing commitment to a safer online environment for young users, reflecting broader societal concerns about online safety and digital well-being for teens.
Meta’s expansion of these safeguards comes amid mounting scrutiny over social media's role in the lives of young people, as well as growing pressure from parents, regulators, and advocacy groups calling for better protections.
Source: With input from agency