Meta will restrict teen users' access to content related to suicide, self-harm and eating disorders as part of an update to its youth safety and privacy policies, the company announced Tuesday.
The content restriction expands on Meta’s existing policy, which barred recommendations of content about suicide, self-harm and eating disorders in teen users’ reels and explore pages. Content about those topics will now also be hidden from teens in their feeds and stories, even when it is shared by an account the user follows, according to Meta’s blog post.
If a teen user searches for terms related to restricted topics, they will be directed to expert resources for help.
As part of the update, Meta will also place all teen users into its more restrictive content control settings on Instagram and Facebook, according to the announcement. The setting already applied to new teen users joining the platforms and will now be extended to teens already on the apps.
Meta will also roll out notifications prompting teens to update their privacy settings. Teen users will have the option to “turn on recommended settings,” which will automatically restrict who can repost their content or tag or mention them, hide offensive comments, and ensure that only their followers can message them.
The update is the latest in a series of changes Meta has made to settings for teen users amid mounting scrutiny over how it and other tech giants affect children’s safety and mental health.
Later this month, Meta CEO Mark Zuckerberg is scheduled to testify before the Senate Judiciary Committee at a hearing on children’s safety along with the CEOs of TikTok, Discord, Snap and X, the platform formerly known as Twitter.