Meta has announced a series of new tools and updates designed to strengthen protections for teens and children across its platforms, including Instagram and Facebook. These updates feature improvements to direct messaging for teen users, expanded nudity protection, and enhanced safeguards for adult-managed accounts that primarily showcase children.
Teen accounts will now display more contextual information about who users are messaging, including safety tips, the account's creation date, and a new option to block and report a message sender in a single action. In June alone, teens who encountered Meta's safety notices went on to block one million accounts and report another one million.
Meta is also taking proactive steps to combat cross-border sextortion scams by rolling out a "Location Notice" feature on Instagram. The alert helps individuals recognize when they're chatting with someone located in another country, since scammers running these schemes frequently operate from abroad. Over 10% of users who received the notice tapped it to learn more about preventative measures.
Nudity protection, a key feature that automatically blurs suspected nude images in DMs, continues to be widely adopted. As of June, 99% of people, including teens, had kept the setting enabled. The feature also appears to deter the forwarding of explicit content: nearly 45% of people who received a warning decided against sharing such images.
For accounts managed by adults that prominently feature children—such as those run by parents or talent agents—Meta is now rolling out teen account-level protections. These include the activation of strict message controls and “Hidden Words” to automatically filter offensive comments, aiming to prevent unwanted or inappropriate contact before it occurs.
Furthermore, Meta will reduce the visibility of these accounts to suspicious individuals, making them harder to find via search or recommendations. This complements previous efforts, such as removing the ability for such accounts to accept gifts or offer paid subscriptions.
In an ongoing crackdown, Meta removed nearly 135,000 Instagram accounts for leaving sexualized comments or requesting sexual images from adult-managed accounts featuring children under 13. Additionally, 500,000 more Facebook and Instagram accounts linked to those original accounts were also removed. Meta is actively collaborating with other tech companies through the Tech Coalition’s Lantern program to prevent harmful actors from reappearing on other platforms.