Meta has introduced new measures to improve teen safety on Instagram, requiring users under 16 to get parental permission before livestreaming. The update is part of a broader effort to expand protections across Instagram, Facebook, and Messenger, giving parents greater control over their children’s online interactions. The changes come as Meta works to comply with child safety regulations in countries including the UK, US, Canada, and Australia.
Stronger Parental Controls Across Platforms
Instagram’s new update introduces several parental control features designed to protect teens from potentially harmful content and interactions. Among the changes, teens under 16 will need parental consent to use livestreaming. Additionally, the platform will require parental approval for teens to disable a nudity-blurring feature in direct messages.
The measures build on the “teen account” settings introduced last year, which allow parents to:
- Set daily time limits for app usage
- Restrict access to the app during specific hours
- Monitor messaging activity
Similar changes are now being extended to Facebook and Messenger accounts in select countries, including the UK, US, Canada, and Australia. For teens under 16, parental approval will be necessary to adjust privacy and safety settings, while 16- and 17-year-olds will still be able to modify settings independently.
Child Safety Groups Call for More
Meta’s initiative to tighten controls on teen accounts is being met with cautious approval from child safety groups. The National Society for the Prevention of Cruelty to Children (NSPCC) has praised the company for expanding its protective features. However, the charity stresses that further steps are necessary.
Matthew Sowemimo, head of child safety policy at the NSPCC, called the update a step in the right direction but said more needs to be done. “These changes need to be backed by stronger prevention strategies,” he said, urging Meta to go beyond restricting settings and proactively block harmful content altogether.
Despite the push for further action, Meta reports that 54 million teens worldwide are currently using Instagram’s teen settings, with 90% of 13–15-year-olds sticking to the platform’s default protective measures. While this indicates a positive trend, child safety advocates continue to call for greater accountability from the platform.
Update Comes as UK Enforces New Online Safety Laws
Meta’s updated safety measures also coincide with the recent rollout of the UK’s Online Safety Act. The law requires tech platforms to take proactive steps to block illegal content and to shield users under 18 from harmful material, including content related to self-harm or suicide.
The timing of these updates aligns with growing global efforts to ensure that social media platforms protect young users from the dangers of online exposure. Meta’s move is part of a wider shift toward empowering parents and caregivers to better manage their children’s online experiences. However, critics warn that tech companies must remain vigilant and not allow external pressures, such as trade negotiations, to dilute these protections.
As Meta continues to refine its approach to teen safety, it faces ongoing scrutiny from child protection organizations and regulators. The latest update is a welcome move, but the call for more robust safeguards remains strong; whether Meta goes further to block harmful content will be worth watching in the coming months.