A trial program conducted by Pornhub in collaboration with UK-based child protection organizations aimed to deter users from searching for child sexual abuse material (CSAM) on its website. Whenever CSAM-related terms were searched, a warning message and a chatbot appeared, directing users to support services. The trial reported a significant reduction in CSAM searches and an increase in users seeking help. Despite some limitations in the data and the difficulty of measuring deterrence, the chatbot showed promise in discouraging illegal behavior online. While the trial has ended, the chatbot and warnings remain active on Pornhub’s UK site, with hopes that similar measures will spread to other platforms and create a safer internet environment.

  • laughterlaughter@lemmy.world · 10 months ago

    It’s not really that underwhelming. Disclaimer: I don’t condone child abuse. I find it abhorrent, and I will never justify it.

    People have fantasies, though. If a dude searches for “burglar breaks in and has sex with milf,” does that mean he wants to do this in real life? Of course not (or God, I hope not!). So some people may have searched for “dad has sex with young babysitter” and bam! Bot! Some people have a fetish for diapers; there’s tons of porn of adults wearing diapers and having sex. Not my thing, but who am I to judge? So again, someone searches “sex with diapers” and bam! Bot!

    Let’s not forget that as much as Pornhub displays a sign asking “Hey, are you 18?”, a lot of people will lie. And those young folks will also search for stupid things.

    So I don’t think that aaaaaall 1+ million searches were done by people with actual pedophilia.

    The fact that 1,600 people, in the UK alone, decided to click through and inform themselves? That’s a lot, in my opinion, and it’s something to commend, not dismiss with “eh, underwhelming.”