
Facebook and Instagram’s algorithms facilitated child sexual harassment, state lawsuit claims

Unredacted Meta Presentation Reveals Internal Estimate of 100,000 Child Users Harassed Daily

In a significant development, an unredacted internal presentation from Meta has come to light, indicating that the company’s own employees estimated that 100,000 child users were harassed daily on Facebook and Instagram. The revelation adds fuel to the legal challenges Meta is facing, particularly the lawsuit filed by the state of New Mexico, which accuses Meta of failing to protect children and alleges that its algorithms recommended inappropriate content to minors. The internal document sheds light on concerns raised by Meta employees about how those algorithms, especially the “People You May Know” feature, connected children to potentially harmful individuals.

 

  1. People You May Know (PYMK) Algorithm Concerns:

    • The internal document singles out Facebook’s “People You May Know” (PYMK) algorithm as a major contributor to connecting children with potential predators. Employees reported that the algorithm was responsible for 75% of all inappropriate adult-minor contacts.
  2. Rejected Recommendations for Algorithm Redesign:

    • Meta employees reportedly raised concerns and recommended redesigning the PYMK algorithm so that it would stop recommending adults to minors. Meta executives allegedly rejected these recommendations, and employees expressed frustration over the refusal to address the algorithm’s role in facilitating inappropriate interactions.
  3. Instagram’s Disturbing Issues:

    • The internal document highlights Instagram as a platform where the issues were particularly insidious. A 2020 memo revealed that “sex talk” was 38 times more prevalent on Instagram than on Facebook Messenger in the U.S., and the platform faced criticism for facilitating inappropriate interactions, including cases in which minors were solicited.
  4. Legal Action by New Mexico and Other States:

    • The state of New Mexico filed a lawsuit against Meta, alleging that the company failed to address large-scale predation on its platform, particularly concerning recommendation algorithms. The lawsuit claims that Meta knew about the issues but did not take sufficient action to protect users, leading to instances of child exploitation.
  5. Late Actions and Ongoing Lawsuits:

    • New Mexico alleges that Meta only took action to limit adult predation on minors in late 2022, and that those measures fell short of the recommendations made by its own safety staff. Meta also faces multiple other lawsuits from states, including a complaint joined by 41 states raising concerns about its platforms’ impact on the mental health of young users.
  6. Meta’s Recent Measures for Teen Users:

    • Meta recently introduced measures for teen users on Instagram and Facebook, aiming to enhance safety. These measures include restricting non-followers from messaging teens and blocking offensive comments.
  7. Deceptive Practices Allegations:

    • Another unsealed complaint by 33 states alleges that Meta “coveted and pursued” users under the age of 13 and engaged in deceptive practices regarding the handling of underage users’ accounts.

The unredacted Meta presentation provides a deeper understanding of the concerns Meta’s own employees raised about the potential harm to child users on Facebook and Instagram. With ongoing legal challenges and increased scrutiny, Meta faces intensified pressure to address these issues, implement robust safety measures, and respond to allegations of deceptive practices in handling underage users’ accounts. The revelations from the internal document are likely to influence the ongoing legal battles and further shape discussions around online safety, especially for younger users.
