Meta announced on Tuesday that it will hide certain content from teens on Instagram and Facebook, including posts that discuss suicide, self-harm, and eating disorders. As part of a push toward more “age-appropriate” experiences, these sensitive topics will no longer appear in teens’ feeds, even when they are shared by accounts the teen follows.
Meta recently published a blog post describing three new initiatives. Details: “We will start hiding more types of content on Instagram and Facebook, based on expert advice. We’re putting all teens into the most restrictive content control settings on Instagram and Facebook, and on Search on Instagram. We’re sending teens a notification to encourage them to update their Instagram privacy settings.”
The post states, “We want teens to enjoy a safe and age-appropriate app experience.”
Meta has not yet made clear exactly how it intends to address the issue.
Meta acknowledged that a story about a person’s ongoing struggle with suicidal thoughts can be powerful and can help de-stigmatize these issues, but noted that the topic is complex and isn’t suitable for everyone. The company said it will remove this type of content, along with other inappropriate material, from teens’ experiences on Instagram and Facebook.
The policy updates coincide with lawsuits filed by various U.S. states accusing the social media giant of designing features that encourage children to become addicted. The suits claim that Meta’s platforms have contributed to the mental health crisis in America.
To ensure teens check their privacy settings and are aware of the more private options, Meta said it is now sending notifications encouraging them to update those settings. “If teens choose to ‘Turn on Recommended Settings,’ we will automatically change their settings so that only their followers can message them.”