The whistleblower claims that Meta has not taken sufficient measures to protect children.

A whistleblower has said that Meta, the company run by Mark Zuckerberg, failed to take sufficient measures to protect children in the wake of Molly Russell's death, despite already having the systems in place to prevent teenagers from being exposed to harmful content on social media.

Arturo Béjar, a former senior engineer and consultant at the company that owns Instagram and Facebook, said that if Meta had acted on the lessons of Molly's death and the inquest that followed, it could have made its platforms safer for young users. Instead, Béjar's research on Instagram users found that 8.4% of 13- to 15-year-olds had seen someone harm themselves, or threaten to do so, within the past week.

Béjar told the Guardian that if Meta had taken the impact of Molly Russell's story to heart, it would have built a product genuinely suitable for 13- to 15-year-olds: one in which one in 12 young users were not encountering self-harm content or threats of self-harm every week, and in which most of those who did come across such content felt supported.

Molly Russell, a 14-year-old from Harrow in north-west London, took her own life in 2017. A landmark inquest in 2022 found that she had viewed harmful material on Instagram and Pinterest related to suicide, self-harm, depression, and anxiety, and concluded that she died from an act of self-harm while suffering from depression and the negative effects of online content.

According to Béjar, Zuckerberg has everything he needs to make Instagram safer for teenagers, but the company has chosen not to make those changes.

In Béjar's view, either a new chief executive is needed, or the current one must decide to keep this kind of content off the platform. The infrastructure and tools to block it, he says, already exist.

Béjar conducted research at Instagram and pushed the company to act on his findings. That work now forms part of a lawsuit brought against Meta by Raúl Torrez, the attorney general of New Mexico, which alleges that the company is not doing enough to protect children from sexual abuse, predatory behavior, and human trafficking. Unredacted documents from the suit show that, after Molly's death, Meta employees warned that the status quo was unacceptable to the media, to affected families, and to the wider public.

Béjar was a director of engineering who oversaw child safety tools and systems for handling harmful content, such as bullying, across the company's platforms. He left that senior role in 2015 and returned as a consultant in 2019 for two years. His research during that period produced alarming statistics: one in eight 13- to 15-year-olds on Instagram had received unwanted sexual advances, one in five had been bullied, and 8% had viewed self-harm content.

The former Meta staffer has urged the company to set targets for reducing harmful content. "This will create an incentive for them to address these issues in the long run," he said.

Béjar has advised Meta to make several changes: make it easier for users to flag unwanted content and to say why they do not want to see it; regularly survey users about their experiences on Meta platforms; and simplify the process for users to submit reports about their experiences on Meta services.

Béjar continues to monitor Instagram and says harmful material, including self-harm content, is still present on the app, along with clear evidence of under-age users, despite Instagram's minimum age requirement of 13.

Béjar has spent this week meeting UK politicians, regulators, and campaigners, including Ian Russell, Molly's father, whose organization, the Molly Rose Foundation, helped arrange the visit. Last year Béjar testified before the US Congress, giving a firsthand account of his time at the company and of the unwanted sexual advances and harassment his teenage daughter and her friends experienced on Instagram.

According to Béjar, Meta could get self-harm content under control within three months. The company has the tools to do so, he said; what remains is the decision to prioritize a safe environment for teenagers and to measure and publicly report on that work.

A Meta spokesperson said that teams across and beyond the company are working on ways to keep young people safe online, and that, working with parents and experts, it has introduced more than 30 tools and resources to help teenagers and their families have safe, positive experiences online. That work, they said, is ongoing.

Meta's safety measures include automatically setting the accounts of users under 16 to private when they join Instagram, restricting adults from sending private messages to teenagers who do not follow them, and giving Instagram users options to report bullying, harassment, and sexual activity.

Source: theguardian.com