• Эшли Карамель · 11 months ago

    But then they wouldn’t have consented to the creation of porn of themselves. If it’s a deepfake, it is literally non-consensual porn.

    • thantik@lemmy.world · 11 months ago

      Hate to break it to you, but this is already legal. “Non-consensual porn” only applies to photographs. Nobody should have to consent to everything like that.

      If I draw you standing under the Eiffel Tower, fully clothed, the legality shouldn’t change just because you don’t LIKE what’s being drawn.

      • Эшли Карамель · 11 months ago

        I’m aware it’s already legal; that’s exactly why action should be taken. Plus, videos are just a series of photos stitched together, so I don’t see your point about it only applying to photos.

        • thantik@lemmy.world · 11 months ago

          Because the nudity is the only thing that makes it different from people simply drawing others in art.

          Just because you don’t like pornography shouldn’t change its legality. It’s prudishness and puritanism at its finest.

      • Ð Greıt Þu̇mpkin@lemm.ee · 11 months ago

        “Nobody should have to consent to everything like that.”

        I’m sorry, but holy fuck, that is just morally bankrupt.

        People should have the ABSOLUTE right to control any distribution of their image of a sexual nature that they didn’t actively consent to being out there.

        Anything less facilitates the culture of sexual abuse that lets the Fappening or age-of-consent countdown clocks happen.

        Drawing a picture of someone under the Eiffel Tower is a wildly different act from drawing them in the nude without their knowledge and agreement, with full knowledge of what you plan to do with that nude piece.

        • Fal@yiffit.net · 11 months ago

          Calling this sexual abuse is absolutely insulting and disgusting.

          • Ð Greıt Þu̇mpkin@lemm.ee · 11 months ago

            Trying to pretend it’s not is feeding the culture of not listening to victims.

            It’s like saying that catcalling is harmless; forcing people to be reminded that they are seen as a sex object is a well-known and documented tool for keeping the victim “in their place.”

            It’s harassment, and when done at the scale famous folks experience for the crime of being well known and attractive, it basically amounts to a campaign of terror via sexual objectification.

            Never mind how tolerating it makes space for even more focused acts of terror, like doxxing and threats of sexual assault.

            • thantik@lemmy.world · 11 months ago

              Then you need to take a step back and look at your argument.

              Producing the work isn’t the problem here. Distributing it and harassing people with it is.

              So why don’t we just make distributing it as a form of harassment illegal instead? You deal with the specific thing that causes the problem, not the broader thing it stems from, just because you don’t like nudity.

              But if I want to sit here and make AI pictures of fake women who might incidentally look like some real woman, and whack off to them in my bunk, nobody should be penalized for that. You’re painting with broad brushstrokes here, without thinking of the larger repercussions.

              What about twins? Who consents there? If one gives permission and the other doesn’t… then what? How do you handle edge cases like that? Because you’re now trying to put rules around something that’s an awfully grey area.

              You lose all those weird edge cases once you attack the real problem: harassing people with sexual images. It’s not the nudes that are the problem; it’s the harassment.

              • Ð Greıt Þu̇mpkin@lemm.ee · 11 months ago

                Nah, it’s the nudes, and you have more than enough free material on Pornhub from consenting participants not to need to crusade about your God-given right to spank-bank Jennifer Lawrence over her objections to you having nude images of her.

                As much as it enrages people who don’t touch grass, you’re not actually entitled to non-consensually get yourself fap material of people who don’t want the public having fap material of them, and insisting you are is pretty fuckin’ rapey, actually. I’m sure you insist you’re a nice guy or nice girl too and wonder why the pretty people won’t give you a shot over the jerks and hoes.

                • thantik@lemmy.world · 11 months ago

                  Happily married for 20 years with 3 kids, but yeah… sure. Whatever you need to tell yourself to justify your confirmation bias. The rest of us who think logically have worked through this already. No need for some religious puritan to tell us naked = bad.

            • Fal@yiffit.net · 11 months ago

              “Trying to pretend it’s not is feeding the culture of not listening to victims.”

              No, it’s insulting to actual victims of actual events that happen in real life.

              • Ð Greıt Þu̇mpkin@lemm.ee · 11 months ago

                Getting spammed with artificially generated nudes to “put you in your place” is real life, and jackasses like you are why victims hesitate to report this behavior, which often escalates to credible threats of physical violence.

                • Fal@yiffit.net · 11 months ago

                  Being spammed and harassed and threatened is a totally different thing that, like you said, is real life.

        • DreamerofDays@kbin.social · 11 months ago

          I’m wondering if the degree of believability of the image has, or should have, any bearing on the answer here. Like, if a third party who was unaware of the image’s provenance came across it, might they believe the image is authentic or authorized?

          From another angle: we allow protections on the use of fictional characters and their images. Is it so wild to think that a real person might be worthy of the same protections?

          Ultimately, people are going to be privately freaky however they’re going to be privately freaky. It mostly only becomes a problem when it stops being private. I shouldn’t have to see that a bunch of strangers made porn to look like me, and neither should Taylor. And mine are unlikely to make it into the tabloids.

          • thantik@lemmy.world · 11 months ago

            From https://www.owe.com/resources/legalities/7-issues-regarding-use-someones-likeness/

            “A. The short answer is no. Individuals do not have an absolute ownership right in their names or likenesses. But the law does give individuals certain rights of ‘privacy’ and ‘publicity’ which provide limited rights to control how your name, likeness, or other identifying information is used under certain circumstances.”

            From that page, it actually looks like there are very specific criteria for this, and Taylor Swift HERSELF is protected because she is a celebrity.

            However, there are still a lot of gotchas. So instead of making the product/art itself illegal, using it as harassment should be what’s illegal. Attaching someone’s name to it in an attempt to defame them is what’s already illegal here.

    • intensely_human@lemm.ee · 11 months ago

      Having an image of them exist somewhere isn’t the sort of thing a person should have to consent to.

      Consent is for things that affect that person.