A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable because their service plainly and openly demonstrates one of the most severe harms of generative AI tools: easily creating nonconsensual pornography of ordinary people.

  • roscoe · 7 months ago

    As soon as anyone can do this on their own machine with no third parties involved all laws and other measures being discussed will be moot.

    We can punish nonconsensual sharing but that’s about it.

    • CeeBee · 7 months ago

      As soon as anyone can do this on their own machine with no third parties involved

      We’ve been there for a while now

      • roscoe · 7 months ago

        Some people can, I wouldn’t even know where to start. And is the photo/video generator completely on home machines without any processing being done remotely already?

        I’m thinking about a future where simple tools are available where anyone could just drop in a photo or two and get anything up to a VR porn video.

        • CeeBee · 7 months ago

          And is the photo/video generator completely on home machines without any processing being done remotely already?

          Yes

          • roscoe · 7 months ago

            Well, shit. It seems like any new laws are already too little, too late then.

            • JDPoZ · 7 months ago (edited)

              Stable Diffusion has been easily locally installed and runnable on any decent GPU for 2 years at this point.

              Combine that with Civitai.com, which offers easy-to-download models of almost anything you can imagine (IP, celebrities, concepts, etc.), and the possibilities have been endless.

              In fact, completely free apps like Draw Things on iOS let you run it locally on YOUR PHONE: you can download models, tweak and customize them, and hand it images directly from your device's library, making this stuff trivial on the go.

              • T156 · 7 months ago

                Tensor processors/AI accelerators have also been a thing on new hardware for a while. Mobile devices have them, Intel and Apple include them in their processors, and it's not uncommon to find them on newer graphics cards.

                That would just make it easier compared to needing quite a powerful computer for that kind of task.

    • neptune · 7 months ago

      I can paint as many nude images of Rihanna as I want.

      • yildolw · 7 months ago

        You may be sued for damages if you sell those nude paintings of Rihanna at a large enough scale that Rihanna notices.