A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits. The account is particularly notable because it plainly and openly demonstrates one of the most severe harms of generative AI tools: how easily they can be used to create nonconsensual pornography of ordinary people.

  • echo64@lemmy.world · 8 months ago

    Every time this comes up, all the tech nerds here like to excuse it as fine and not a bad thing at all. I’m hoping that won’t happen this time, but knowing Lemmy’s audience…

    • sbv@sh.itjust.works · 8 months ago

      The Lemmy circlejerk is real, but excusing deepfake porn is pretty off-brand for us. I’m glad the comments on this post are uniformly negative.

    • roscoe@lemmy.dbzer0.com · 8 months ago

      I think part of the difficulty in discussing this is that the discussions usually combine two different things: production and distribution.

      I was informed elsewhere in this thread that people can already produce these images/videos on their own machines, with no third parties involved and no remote processing. I can’t think of a single thing that can be done about that, so acceptance is all we’ve got.

      Nonconsensual sharing, on the other hand, is something we can and should do something about. The legal system won’t be able to stop it altogether, but it can push it to the fringes and keep it from becoming mainstream, so that victims don’t see fake images/videos of themselves proliferating everywhere.

    • cley_faye@lemmy.world · 7 months ago

      It’s not a matter of excusing it. Distributing someone’s picture without their explicit consent, and anything like that, is inexcusable. But we’re talking about the generation of said content, which technically can’t be stopped without severely restricting everything.

    • Thorny_Insight@lemm.ee · 8 months ago

      I’m not saying it’s not a bad thing, but it’s inevitable. The problem will only get worse, and there’s no stopping it. It’s something we’re just going to have to accept as a new normal. If we can deal with living under the constant threat of nuclear armageddon, then I think we can live with fake nudes as well.

      • echo64@lemmy.world · 8 months ago

        Yeah, it’s this shit I’m talking about. We have a whole legal and justice system to deal with this. No one needs to accept sexual abuse as a new normal. This shit is weird.

        • Thorny_Insight@lemm.ee · 8 months ago

          I’m not saying there shouldn’t be consequences for someone who spreads these pictures with the intention of harming someone’s reputation, but it’s incredibly naive to think the justice system is going to stop deepfakes when it can’t even prevent bike theft. 12-year-olds are making these with their smartphones. The technology is extremely accessible and easy to use, and that is not going to change. I’m sorry, but you’re not putting the toothpaste back in the tube. Wait a few years and you’ll be able to generate photorealistic porn videos of anyone you want.

          • echo64@lemmy.world · 7 months ago

            We can’t stop bike theft, so fuck off women, you’re fair game coz this guy said so.

              • echo64@lemmy.world · 7 months ago

                You might want to look up what strawmanning means. I’m just flat out mocking what you said.

        • Dkarma@lemmy.world · 8 months ago

          No, we don’t. What’s happening here isn’t covered by current laws.

        • 0x0@programming.dev · 8 months ago

          Sexual abuse?

          Child pornography involves molesting a child and is a crime, as it should be.

          Fake nudes have been a thing for ages and are only an issue if the targeted party takes offense. It may be slander, but it’s certainly not sexual abuse.

          No one is accepting sexual abuse, so drop it down a notch, Karen.

          • sbv@sh.itjust.works · 8 months ago

            “…only an issue if the targeted party takes offense.”

            Deepfakes can change how the victim is treated by other people. Especially by other kids.

            Upthread, someone states:

            “…just going to have to accept as a new normal.”

            Which sounds a lot like accepting this kind of shit, regardless of what you call it.

          • eatthecake@lemmy.world · 7 months ago

            From another comment:

            “To people who aren’t sure if this should be illegal or what the big deal is: according to Harvard clinical psychiatrist and instructor Dr. Alok Kanojia (aka Dr. K from HealthyGamerGG), once non-consensual pornography (which deepfakes are classified as) is made public, over half of the people involved will have the urge to kill themselves. There’s also extreme risk of feeling depressed, angry, anxious, etc. The analogy given is that it’s like watching video the next day of yourself undergoing sex without consent, as if you’d been drugged.”

            Try to imagine watching a realistic video of yourself being abused; imagine your mother watching it. That will absolutely fuck some people up, and a lot of those victims are going to be children. Shit is going to get bad.

            • 0x0@programming.dev · 7 months ago

              I wouldn’t put actual non-consensual pornography and fake pornography of any kind in the same bag, but, geez, I’m not a doctor.

              Deepfakes certainly improve on the technical realism of 90s Photoshop fakes. Doesn’t that still qualify as slander? (Also not a lawyer.)

              • eatthecake@lemmy.world · 7 months ago

                The question is why, with an internet full of porn, men want nonconsensual pornography that they know women are opposed to. It’s as if the hurtfulness, the lack of consent, and the control over the woman in the video are actually the point.

                • 0x0@programming.dev · 7 months ago

                  Non-consensual pornography is called rape and it’s a crime in most of the world.