
Meta has announced an update to its youth safety and privacy policies, focusing on restricting content related to suicide, self-harm, and eating disorders for teenage users.

The new measures expand upon existing policies by barring recommendations of such content and actively hiding it from teens’ feeds and stories, even if it’s shared by accounts they follow.

In addition to content restrictions, the update places all teenage users into Instagram and Facebook's most stringent content control settings. This change, previously applied only to new teen accounts, will now extend to existing teen users. Meta is also rolling out notifications encouraging teens to review and adjust their privacy settings. These prompts will guide them to activate recommended settings that offer additional protections, such as restricting content reposting, managing tags and mentions, filtering offensive comments, and controlling who can send them messages.

This policy revision comes as Meta faces increasing scrutiny over its impact on the safety and mental health of young users. In response to these concerns, Meta CEO Mark Zuckerberg is scheduled to appear before the Senate Judiciary Committee later this month, alongside the CEOs of TikTok, Discord, Snap, and X (formerly Twitter), to discuss children’s safety on these platforms.