THE SENATE UNANIMOUSLY passed a bipartisan bill to provide recourse to victims of porn deepfakes, meaning sexually explicit, non-consensual images created with artificial intelligence.

The legislation, called the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, passed Congress’ upper chamber on Tuesday. The bill has been led by Sens. Dick Durbin (D-Ill.) and Lindsey Graham (R-S.C.), as well as Rep. Alexandria Ocasio-Cortez (D-N.Y.) in the House.

The legislation would amend the Violence Against Women Act (VAWA) to allow people to sue those who produce, distribute, or receive deepfake pornography if they “knew or recklessly disregarded” the fact that the victim did not consent to those images.

  • fine_sandy_bottom · 5 months ago

    I don’t think that’s what it means.

    “A depiction which is authentic” might refer to provenance.

    If someone authorises me to make a pornographic depiction of them, surely that’s not illegal. It’s authentic.

    So it’s not a question of whether the depiction appears to be AI generated; it’s really about whether a reasonable person would conclude that the image is a depiction of a specific person.

    That means tattoos, extra limbs, third boobs, et cetera won’t sidestep this law.