A recently published internal study by Meta, Instagram's parent company, shows that the platform inadvertently displays large amounts of content related to eating disorders and negative body image to the most vulnerable adolescents.
The findings, revealed in documents obtained by Reuters, raise new concerns about the psychological impact of social networks on young people.
According to the study, Meta surveyed 1,149 adolescents during the 2023-2024 school year about how they felt about their bodies after using Instagram.
The research team then manually tracked the content shown to these participants for three months.
The results showed that among the 223 adolescents who were often self-conscious about their appearance, posts related to eating disorders accounted for 10.5% of the content they saw, more than three times the share seen by the rest of the group (3.3%).
These posts often prominently featured body parts such as the chest, buttocks, and thighs; contained body-judging language; or described extreme eating behaviors.
Although such content does not violate Instagram's content policies, experts say its dense presence can negatively affect young users' psychology and body image.
In addition, the adolescents with the most negative feelings about themselves were also exposed to more disturbing or harmful content, including adult themes, dangerous behavior, and content depicting suffering, which accounted for 27% of what they saw, nearly double the share for the neutral group.
Meta researchers acknowledge that this result does not prove Instagram makes users feel worse, but it does show a clear link between the content displayed and negative feelings about the body.
They also note that these teens may have actively sought out related content, prompting the algorithm to recommend more of it.
Another notable part of the report concerns the limitations of the current moderation system. Meta's screening tools detected only 1.5% of the sensitive content the company itself deemed unsuitable for adolescents. Meta said it is developing new algorithms to address the issue.
Meta representative Andy Stone said the findings reflect the company's efforts to better understand the experience of young users and build a safer platform.
Mr. Stone added that Meta is testing a policy of limiting content shown to adolescents to a PG-13 standard, equivalent to the movie rating for viewers aged 13 and older.
Although Meta has affirmed its commitment to improvement, this study once again raises questions about the responsibility of social media platforms to protect the mental health of young users, the group most vulnerable to the power of algorithms and body-image content online.