Police in 20 countries have taken down a network that distributed images of child sexual abuse entirely generated by artificial intelligence.

The operation – which spanned European countries including the United Kingdom, as well as Canada, Australia and New Zealand – is “one of the first cases” involving AI-generated child sexual abuse material, Europe’s law enforcement agency Europol, which supported the action, said in a press release.

Danish authorities led the operation, which resulted in 25 arrests, 33 house searches and the seizure of 173 devices.

  • catloaf@lemm.ee
    2 days ago

    Theoretically you should be able to generate it by cobbling together legal images.

    But given the massive volume of scraped data, they’ve also ended up with actual CSAM in their training data. I recall seeing articles about them having to identify and remove it, but I’m definitely not adding that search to my history.

    • Q*Bert Reynolds@sh.itjust.works

      I’m not even talking about the accidentally scraped images. People retrain models to make porn that more accurately depicts their fetish all the time.

    • Bob Robertson IX
      2 days ago

I find the legal aspect of it fascinating, because proving the age of a computer-generated boob is impossible unless you have access to the actual prompt used to create it, and that prompt specifically asks for an image of CSAM. And even then, who’s to say the system didn’t create most of the image from legal training data and just add underage facial features? It’ll be interesting to see how they prosecute and defend against these charges.