The police investigation remains open. The photo of one of the minors included a fly, the logo of Clothoff, the application presumably being used to create the images; it promotes its services with the slogan: “Undress anybody with our free service!”

  • fiah

    that is not what this comment thread is about

    • nudny ekscentryk@szmer.info

      it very much is:

      OP: In Spain, dozens of girls are reporting AI-generated nude photos of them being circulated at school: ‘My heart skipped a beat’

      parent reply: That’s why we need Blockchain Technology

      • fiah

        a discussion can have multiple, separate threads with branching topics; that’s what this threaded comment system is specifically made to facilitate

        • nudny ekscentryk@szmer.info

          okay, let’s rethread how we got here:

          OP: Spanish girls report AI porn of them circulating

          parent comment: Blockchain could fix this

          1st-level reply: Blockchain can’t counteract fake porn being created

          2nd-level reply: it lets you verify the original source

          3rd-level reply: if anything, it lets you verify integrity between sources

          you: if a central authority can’t be trusted to verify sources then Blockchain can

          me: it’s not about verifying the provenance of the material but rather about its mere existence in the world

          you: we can store the fingerprint of the file in a trusted database

          me: but this doesn’t affect the material’s existence

          you: you’re going off-topic!

          me: I am not

          you: this conversation can have multiple threads

          can you now see how it’s you who’s off the rails in this conversation? no one ever questioned how blockchain could allow verifying any piece of media’s authenticity, but spreading forged, nonconsensual erotica is NOT about proving whether a photo or video in question is authentic; the problem is that people have the tools to create it in the first place, and before a victim can counteract and prove (using blockchain, if you will) that a particular photo is a forgery, the damage is done regardless
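          A minimal sketch of the “fingerprint of the file in a trusted database” idea discussed above, in Python; the dictionary named registry is a hypothetical stand-in for whatever trusted store (blockchain or otherwise) would actually hold the digests, so this is an illustration of the concept rather than any particular project’s implementation:

              import hashlib
              from pathlib import Path

              # Toy stand-in for the "trusted database": digest -> claimed origin.
              registry = {}

              def fingerprint(path):
                  # SHA-256 of the file's bytes; any change to the file changes the digest.
                  return hashlib.sha256(Path(path).read_bytes()).hexdigest()

              def register(path, origin):
                  # Record the digest of an original file together with who published it.
                  registry[fingerprint(path)] = origin

              def verify(path):
                  # Return the recorded origin if this exact file was registered, else None.
                  # A forged image simply won't be in the registry; the scheme says
                  # nothing about whether such an image exists or is being shared.
                  return registry.get(fingerprint(path))

          Checking a digest can only confirm that a given copy matches a registered original (the “verify integrity between sources” point); as the comment argues, it does nothing to stop a forged image from being created or shared in the first place.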

          • fiah

            okay, let’s rethread how we got here:
            OP: Spanish girls report AI porn of them circulating
            parent comment: Blockchain could fix this

            you’re missing a step there, buddy. I know, it’s hard; let me make it a bit easier for you by drawing a picture:

            “blockchain can fix this” was never about preventing AI porn from being spread; it’s about the general problem of knowing whether something is authentic, hence their choice to reply to that comment with that article

              • papertowels@lemmy.one

                …you’re right, it has nothing to do with nudes because it’s talking about an entirely different problem of court-admissible evidence.

              • fiah

                yes, you’re right, it doesn’t, because we weren’t talking about that. “blockchain” can’t do anything to stop AI-generated naked pictures of kids from being spread, and nobody here claimed otherwise