Meta to block harmful content for teen users

Teen accounts will be less likely to encounter harmful content, such as graphic violence, eating disorders and self-harm
The image displays Meta logo. — Pixabay

Meta has announced restrictions on the content displayed to teenagers on Facebook and Instagram, amid rising pressure from regulators who claim the platforms are harmful to young users.

According to the company, teen accounts will be less likely to encounter harmful content, such as graphic violence, eating disorders and self-harm.

These changes involve the prevention of certain content from appearing in the Feed and Stories, even if it is posted by a teenager’s friend. The social media giant stated that these updates will be implemented across all teenage accounts in the upcoming weeks.


Meta is also set to testify before the Senate on child safety on January 31, alongside X (formerly Twitter), TikTok, Snap, and Discord.

“We regularly consult with experts in adolescent development, psychology and mental health to help make our platforms safe and age-appropriate for young people, including improving our understanding of which types of content may be less appropriate for teens,” the company wrote in a blog post.

Meta has stated that it currently conceals search results related to suicide and self-harm, and it is now expanding this protective measure to encompass additional terms. This setting is automatically activated for new teenagers joining the platforms. However, it will also be extended to teenagers who are already using the applications.

Referred to as “Sensitive Content Control” on Instagram and “Reduce” on Facebook, the content recommendation controls are designed to make it harder for users to come across potentially sensitive content or accounts in places like Search and Explore.

Using notifications, Meta will prompt teenagers to update their app settings for a more private experience. These notifications will appear when a teenager interacts with an account that is not on their friends list.

The development comes after more than 40 states filed legal action accusing the company of having “profoundly altered the psychological and social realities of a generation of young Americans” and of using “powerful and unprecedented technologies to entice, engage, and ultimately ensnare youth and teens.”