‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity::It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • Pyr_Pressure@lemmy.ca · 11 months ago

    Just because something shouldn’t be doesn’t mean it won’t be. This is reality, and we can’t just wish something to be true. Saying it doesn’t really help anything.

    • lolcatnip@reddthat.com · 11 months ago

      Whoooooosh.

      In societies that have a healthy relationship with the human body, nudity is not considered sexual. I’m not just making up fantasy scenarios.

      • mossy_@lemmy.world · 11 months ago

        So because it’s not a problem in your culture, it’s not a problem?

          • mossy_@lemmy.world · 11 months ago

            You caught me, I’m an evil villain who preys on innocent lemmings for no reason at all.