• RickRussell_CA@lemmy.world
    1 month ago

    I suppose the only thing I disagree with is that the law can do anything about it. Obviously, you can go after sites that have money and/or a real business presence, a la Pornhub. But the rest? It’s the wild west.

    • Grimy@lemmy.world
      1 month ago

      You can’t ban the tech, but you can ban the act, which makes it easier to prosecute people who upload deepfakes of their co-workers.

      • DarkThoughts@fedia.io
        1 month ago

        That’s already illegal in most countries, regardless of how it was made. It also has nothing to do with “AI”.

        • Grimy@lemmy.world
          1 month ago

          Obviously, you can go after sites that have money and/or a real business presence, a la Pornhub. But the rest? It’s the wild west.

          I was referring to that part of his comment. It is also not at all illegal in most countries. It’s only illegal at the state level in the US, for example, and not in all of them either. Canada only has 8 provinces with legislation against it.

          I do agree though that it’s not the software’s fault. Bad actors should be punished and nothing more.

    • Todd Bonzalez@lemm.ee
      1 month ago

      I think it’s best that it be illegal so that we can at least have a reactive response to the problem. If someone abuses someone else by creating simulated pornography (by any means), we should have a crime to charge them with.

      You can’t target the technology, or stop people from using AI to do perverted things, but if they get caught, we should at least respond to the problem.

      I don’t know what a proactive response to this issue looks like. Maybe better public education and a culture that encourages more respect for others?

      • DarkThoughts@fedia.io
        1 month ago

        I think it’s best that it be illegal so that we can at least have a reactive response to the problem. If someone abuses someone else by creating simulated pornography (by any means), we should have a crime to charge them with.

        So… Where do you draw the line exactly? Does this include classic photo manipulation too? Written stories (fanfic)? Sketching or doodling a nude figure with a name pointed at it? Dirty thoughts that someone has about someone else? I find this response highly questionable and authoritarian. Calling it abuse also really trivializes actual abuse, which I, as an abuse victim, find pretty objectionable. If I could swap what was done to me for someone making porn of “me” and getting their rocks off to it, I’d gladly make that exchange.

      • AnAnonymous@lemm.ee
        1 month ago

        Making things illegal doesn’t solve anything on the web, mainly because there is no worldwide jurisdiction and because illegality doesn’t stop criminals. See what happened with the war on drugs… just a big failure.

        Maybe attacking the problem at the root, like educating people to avoid porn altogether, could be successful at some point, but either way this is really another problem of the hyperconsumerist capitalist scheme.

        Edit: I believe that if it were illegal, the price of it would even go up, so it would be a bigger business at the end of the day.

        • PoliticalAgitator@lemmy.world
          1 month ago

          Making things illegal absolutely stops criminals. It doesn’t stop all criminals, but that’s never been the expectation. If you want to dismiss laws on the basis of not being 100% effective, there’s not a single law you support.

          • blargerer@kbin.social
            1 month ago

            Yeah… I don’t think there is actually good proof that something being illegal stops a meaningful amount of it on its own. There are plenty of studies showing that people do socially frowned-upon things less IN FRONT OF OTHER PEOPLE (littering, for example), but very weak evidence that it stops such activity in any meaningful way in private settings. Likewise, there is plenty of evidence that other forms of punishment (which is to say, ones with no immediate social stigma) don’t really correlate with reduced activity at all.

            • PoliticalAgitator@lemmy.world
              1 month ago

              I’m not under any obligation to prove it to you until you supply the “evidence” you’re mentioning, but in the domain of rhetoric, the “laws don’t dissuade criminals” people sound dumb as fuck.

              I’ve witnessed laws – which frequently go hand in hand with your “socially frowned upon” factor – change people’s behaviour, from drink driving to your own littering example.

              Every crime is a calculation of risk and reward. The internet makes things lower risk, but there are absolutely laws that work. They’re why the web isn’t riddled with child pornography and why online drug marketplaces have to exist behind 10 layers of bullshit.

              • SupraMario@lemmy.world
                1 month ago

                https://www.ncbi.nlm.nih.gov/books/NBK217455/

                The Road Safety Act had a dramatic impact on Britain’s drivers. In the three months after it took effect, traffic fatalities dropped 23 percent in Britain. In the first year of the act, the percentage of drivers killed who were legally drunk dropped from 27 percent to 17 percent.

                These general trends mask several specific changes in British drinking practices. Research showed that the act did not significantly change the amount people in Britain drank. Rather, the act seems to have affected a very narrow slice of behavior—the custom of driving to and from pubs, especially on weekend nights. After the act took effect, many regular customers took to walking to pubs. Pub owners raised a considerable outcry, and a number of less conveniently located pubs closed.

                Unfortunately, the successes of the act were relatively short-lived. Within a few years, traffic fatalities again began to climb. By 1973 the percentage of drivers killed who were drunk was back to its pre-1967 level. By 1975, for reasons still unknown, this percentage had risen to 36 percent, considerably above what it was before the act.

                Research has also shown that efforts to impose tougher penalties in America have not had much effect. In part, this seems to be caused by people’s belief that “it can’t happen to me.” “After all,” Reed observes, “those who currently drive drunk are not deterred by the small risk of a very severe penalty—accidental death.”

                People gonna people.

        • ji59@kbin.social
          1 month ago

          I believe that if it were illegal, it wouldn’t be so popular and people would have to hide it more. Illegality would put up more barriers to using it, and since people are lazy, fewer people would be interested.

          But personally I think the solution is for people to stop being so sensitive to nudity. If someone posted naked pictures of me I wouldn’t be happy, but neither would I be devastated. And if it were AI generated I could simply deflect it by saying that ain’t me.

          • celeste@kbin.earth
            1 month ago

            I agree that part of the way to deal with it socially is to not, like, ruin people’s lives when there are nude or pornographic images of them out there. When you lose your job because your ex posted a sex tape of you online and attached your name, and now you’re struggling to keep your house – that’s a devastating consequence, regardless of how someone personally feels about porn of them being online.

            I think we should all be like “it could even just be AI” to our more conservative acquaintances when they’re worked up because someone posted gay porn of a local teacher in their group chat.

          • Todd Bonzalez@lemm.ee
            1 month ago

            If someone posted naked pictures of me I wouldn’t be happy, but neither would I be devastated. And if it were AI generated I could simply deflect it by saying that ain’t me.

            How to tell everyone you are male, without saying it directly.

  • Player2@lemm.ee
    1 month ago

    Anyone could run it on their own computer these days, fully local. What could the government do about that even if they wanted to?

    • jeffw@lemmy.worldOP
      1 month ago

      Anyone can make CSAM in their basement, what could the government do about that even if they wanted to?

      Anyone can buy a pet from a pet store, take it home and abuse it, why is animal abuse even illegal?

      Should I keep going with more examples?

      • Player2@lemm.ee
        1 month ago

        What do you want them to do, constantly monitor your computer to see which applications you open? Flag suspicious GPU or power usage and send police to knock on your door? Abusing people or animals requires real outside involvement. You are equating something a computer generates with real life, when the two have nothing to do with each other.

        • jeffw@lemmy.worldOP
          1 month ago

          Who is suggesting that?

          Murder is illegal, do we surveil everyone who owns a gun or knife?

          CSAM is illegal, do cameras all report to the government?

          Again, those are just 2 examples. Lmk if you want more.

          • Player2@lemm.ee
            1 month ago

            Maybe my wording is unclear. I am wondering how they should be expected to detect it in the first place. Murder leaves a body. Abuse leaves a victim. Generating files on a computer? Nothing of the sort, unless it is shared online. What would a new regulation achieve that isn’t already covered under the illegality of ‘revenge porn?’ Furthermore, how can they possibly even detect anything beyond that without massive privacy breaches as I wrote before?

    • Carrolade@lemmy.world
      1 month ago

      The govt’s job is not to prevent crime from happening, that’s dystopian-tier stuff. Their job is to determine what the law is, and apply consequences to people after they are caught breaking it.

      The job of preventing crime from happening in the first place mainly belongs to lower-level community institutions, starting with parents and teachers.

    • foggy@lemmy.world
      1 month ago

      Pandora’s box has already been cracked way open. Shit is already in military application.

    • Todd Bonzalez@lemm.ee
      1 month ago

      The issue is not with all forms of pornographic AI, but more about deepfakes and nudifying apps that create nonconsensual pornography of real people. It is those people’s consent that is being violated.

      • DarkThoughts@fedia.io
        1 month ago

        I still don’t understand why this is now an issue but decades of photo editing did not bother anyone at all.

        • CarbonIceDragon@pawb.social
          1 month ago

          I mean, it did bother people; it just took more skill and time with photo-manipulation software to make it look convincing, so it was rare for someone to both have the expertise and be willing to put in the time, and it didn’t come up often enough to be a point of discussion. AI just makes it quick and easy enough to become more common.

            • Todd Bonzalez@lemm.ee
              1 month ago

              It literally is a one-click solution. People are running nudifying sites that use CLIP, GroundingDINO, SegmentAnything, and Stable Diffusion to autonomously nudify people’s pictures.

              These sites (which I won’t even mention the names of), just ask for a decent quality photo of a woman wearing a crop top or bikini for best results.

              The people who have the know-how to set up Stable Diffusion and all these other AI photomanipulation tools are using those skills to monetize sexual exploitation services. They’re making it so you don’t need to know what you’re doing to participate.

              And sites like Instagram, which are filled with millions of exploitable images of women and girls, have allowed these perverted services to advertise their warez to their users.

              It is now many orders of magnitude easier than it ever has been in history to sexually exploit people’s photographs. That’s a big deal.

              • DarkThoughts@fedia.io
                1 month ago

                If you wanna pay for that then you do you. lol But at that point you could’ve also paid a shady artist to do the work for you.

                Also, maybe don’t pose half naked on the internet already if you don’t want people to see you in a sexual way. That’s just weird, just like this whole IG attention whoring of people nowadays. And no, this isn’t even just a women thing. Just look how thirsty women get under the images of good looking dudes that pose topless, or just your ordinary celeb doing ordinary things (Pedro Pascal = daddy, and yes, that includes more explicit comments too).

                This hypocritical fake outrage is just embarrassing.

                • Todd Bonzalez@lemm.ee
                  1 month ago

                  If you wanna pay for that then you do you.

                  These services don’t even cost anything, they’re just loaded with ads. Do you understand how the Internet works?

                  Also, maybe don’t pose half naked on the internet already if you don’t want people to see you in a sexual way.

                  And straight to victim blaming, on an issue that affects women orders of magnitude more than men. You go straight to implying consent for what they’re wearing and calling them whores for daring to have sexual agency.

                  Women can pose in whatever clothes they want online, that doesn’t give you the right to sexually violate them. You have a rapist mindset…

                  Go fuck yourself, you misogynistic piece of shit.

      • quindraco@lemm.ee
        1 month ago

        No one cares whether you consent to being drawn. The problem here isn’t the consent of the depicted person; it’s that the viewer is being misled. That’s why the moral quandary goes away entirely if the AI porn is clearly labeled as such.

        • CarbonIceDragon@pawb.social
          1 month ago

          I don’t think that this is really true. I strongly suspect that most people I know would consider someone drawing porn of them without consent a majorly icky thing to do, and would probably consider anyone doing that to someone else a creep. The reason such drawings are less of an issue is at least partly that the barrier to entry is lower with AI, since it takes a certain amount of skill and time investment to draw something like that so that it is clearly recognizable as a specific real person.

  • Veraxus@lemmy.world
    1 month ago

    I feel an easy and rational solution is to criminalize a certain category of defamation… presenting something untrue/fabricated as true/real or in a way that it could be construed as true/real.

    Anything other than that narrow application is an infringement on the First Amendment.

      • Veraxus@lemmy.world
        1 month ago

        Exactly. Photoshop has been around for decades. AI is just more of the same. I find it weird how, as technology evolves, people keep fixating on the technologies themselves rather than the universal (sometimes institutional) patterns of abuse.

    • Admiral Patrick@dubvee.org
      1 month ago

      I feel an easy and rational solution is to criminalize a certain category of defamation… presenting something untrue/fabricated as true/real or in a way that it could be construed as true/real.

      I would love that solution, but it definitely wouldn’t have bipartisan support.

      • Veraxus@lemmy.world
        1 month ago

        There are certain political groups that have a vested interest in lying, deceiving, manipulating, and fabricating to get what they want. So… yeah. 😞

        • maynarkh@feddit.nl
          1 month ago

          I feel that’s just most political groups nowadays. I’m not implying both sides are the same, just that everyone likes their own lies.

      • Veraxus@lemmy.world
        1 month ago

        It’s not, though. Not remotely. At least not in the US.

        Defamation of an individual (including “individual” entities like an org or a business) is purely a civil matter, and defamation in a broader sense, such as against “antifa” or “leftists” or “jews” or “gays” et al., has no remedy whatsoever, civil or criminal.