shared via https://feddit.de/post/2805371

Each of these reads like an extremely horny and angry man yelling their basest desires at Pornhub’s search function.

  • raccoona_nongrata@beehaw.org
    1 year ago

    I don’t think just giving up and allowing porn deepfakes and stuff of people is really an acceptable answer here. The philosophical discussion of whether it’s “actually them” or not doesn’t really matter; it’s still intrusive, violating and gross. It’s the same reason stealing someone’s identity is illegal: it doesn’t matter that the identity created by the con man isn’t the real you, damage can still be done with identity theft.

    Maybe there’s nothing you can do about it on the dark web, but sites absolutely can manage deepfakes, in the same way that PornHub will take down non-consensual ex-girlfriend type content.

    The way people just throw up their hands at every new problematic issue with AI is not a good way of dealing with them. We don’t have to tolerate this stuff, and now is the best time to deal with it and start creating rules and protections.

    • Legally speaking, you do have some ownership of your own appearance already (at least you do where I live, but I also think most American states have personality rights), so I’m pretty sure spreading deepfakes of someone is illegal already. If not, it’s time for those rights to be brought to your state, and not just because of deepfakes.

      From a legal point of view, I don’t think we need to take any extra steps specifically against pictures generated by AI. Like you said, stealing someone’s identity is illegal; it doesn’t matter that the identity created by the con man isn’t the real you. Fake pictures of anyone should be banned, although I’m not sure how that will work with the freedom-of-speech absolutism present in some countries.

      We do need legal guidance covering the problems with all current AI developments, like answering the “does a generated model contain its source material” question. Deepfakes are covered by the same law banning people from spreading photoshopped pictures of you online.

      I agree that porn sites should require verification but that won’t fix the “we’ll send images to all of your friends on social media” problem that’s much more pressing. Furthermore, how do you validate someone when AI can be used to create any picture you can think of? Visit the Pornhub office and have your irises scanned? Drivers’ licenses are not that hard to fake and many countries still lack any decent form of digital identity verification in the year of our lord twenty-twenty-currentyear.

      Celebrities can probably take down the published models of their faces on hosting sites, but that won’t stop anyone from recreating those models on their own computers and perhaps spreading them through Discord servers and whatnot. You also run into the legal challenge of “is this meant to generate porn or is it general purpose” and “do I actually own the rights to take down these models”. Photographs taken of you are not your intellectual property, and in many cases you have no say in what others do with them, unless they’re actually doing something illegal with the pictures. If you ask me to take a picture of you, you don’t get copyright or ownership of that picture, not even if you pay me for it unless the transfer of copyright is spelled out in a contract.

      Fixing these issues will require altering some core concepts in laws all around the world. None of these problems are novel or impossible to solve, but they require rethinking the concept of personality rights. We’ll have to find new boundaries to stick to (are professional impressionists illegal? what about porn lookalikes? what about sexy cosplay? what about pencil drawings? what about non-celebrity stuff, like photoshopping people into wedding photographs because they couldn’t be there at the time?), regardless of technological advancements.

      In my opinion, deepfake technology should be treated like we treat Photoshop. It’s a pretty neat tool that is already revolutionary for everything from Instagram filters to movie productions, with some damn dangerous implications if we don’t learn to live with its existence. I can almost guarantee that within a few years Photoshop will add a deepfake button to its UI to insert yourself or someone else into pictures or fully AI generated scenarios; it’s just the next logical step now that it has an image generation feature.

    • davehtaylor@beehaw.org
      1 year ago

      The way people just throw up their hands at every new problematic issue with AI is not a good way of dealing with them. We don’t have to tolerate this stuff, and now is the best time to deal with it and start creating rules and protections.

      Exactly. In another thread on here recently someone said something that basically boiled down to “your protest against AI isn’t going to stop it. There’s too much corporate power behind it. So you might as well embrace it” and I just cannot get my head around that mentality.

      Also, you can absolutely see the models who were used as references in some of the images generated by apps these days. Like that popular one right now that everyone is using to make idealized images of themselves. A few of my family and friends used it recently, and in some of the pics you could clearly see the A-list celebs who were used as pose references, like Gal Gadot, Scarlett Johansson, etc. It’s creepy as hell.

        • davehtaylor@beehaw.org
          1 year ago

          I never said it was. But like the person I was replying to said: we need to take a good hard look at what the hell these tools are doing and allowing and decide as a society if we’re going to tolerate it.

          The real issue here is what things like deepfakes can do. It’s already starting, and it’s going to keep accelerating: mis- and disinformation about private citizens, celebs, and politicians. While you might say “it’s creepy, but there’s nothing we can do about people deepfaking Nancy Pelosi’s face onto their spank material”, it’s extremely problematic when someone decides to make a video where Joe Biden admits to running a CP ring, or some right-wing chud makes a video of Trump appearing to say something they all want to hear, and it leads to a civil war. Those are the real stakes here. How we react to what’s happening with regular folk and celebs is just the canary in the coal mine.

    • lloram239@feddit.de
      1 year ago

      I don’t think just giving up and allowing porn deepfakes and stuff of people is really an acceptable answer here.

      It’s the only sensible answer. Anything else would require an extreme violation of everybody’s privacy and the implementation of total surveillance. See France’s recent attempt at giving police full access to people’s phones; that’s the kind of thing you end up with when going down that route.

      This AI is out there today, can be run on any half-decent gaming PC, and can generate new images in about 30 seconds. And it will only get better going forward. Images are as malleable as text now; you can accept that, or keep tilting at windmills.

      but sites absolutely can manage deepfakes

      Of course they can, and most already do. But on the whole, that really doesn’t have much of an effect: anybody can make their own site, and you don’t even have to go deep into the dark web for that. It’s the first link on Google when you search for it.