A trial program conducted by Pornhub in collaboration with UK-based child protection organizations aimed to deter users from searching for child sexual abuse material (CSAM) on its website. Whenever CSAM-related terms were searched, a warning message and a chatbot appeared, directing users to support services. The trial reported a significant reduction in CSAM searches and an increase in users seeking help. Despite limitations in the data and the complexity of measuring deterrence, the chatbot showed promise in discouraging illegal behavior online. While the trial has ended, the chatbot and warnings remain active on Pornhub's UK site, with hopes that similar measures will be adopted across other platforms to create a safer internet environment.

  • Gabu@lemmy.world · 4 months ago

    Minor complaint: try to get an empty paragraph between the spoilered text and the non-spoilered text whenever possible; it makes it easier to read.

    Regarding the discussion, you’re both right at the end of the day. Limiting exposure to illegal and immoral-adjacent material is obviously in society’s interest, but at the same time the implication that a glorified ad for a mental illness helpline is a good solution is ludicrous; it’s at the absolute bottom of the barrel among the kinds of issues we should be working on.

    • archomrade [he/him]@midwest.social · 4 months ago

      I’m actually not sure how to get another line break in there; I tried adding multiple, but the markdown kept collapsing them.

      And I happen to agree that a ‘surgeon general’s warning for CSAM’ isn’t doing much to solve the problem, but I do think alerting users that they are adjacent to that material is a good idea. OP seemed to be complaining about censorship, but this practice didn’t even amount to that. The article is mostly just marketing slop.