• PM_Your_Nudes_Please · 5 months ago

    Yeah, it’s very similar to the “is loli porn unethical” debate. There’s no direct victim, it could supposedly help reduce actual CSAM consumption, etc. But it’s icky, so many people still think it should be illegal.

    There are two big differences between AI and loli, though. The first is that AI would supposedly have to be trained with CSAM to be able to generate it; an artist can create loli porn without actually using CSAM references. The second is that AI porn is much, much easier for a layman to create. It doesn’t take years of practice to produce passable porn. Anyone with a decent GPU can spin up a local instance and be generating within a few hours.

    In my mind, the former difference is much more impactful than the latter. AI becoming easier to access seems inevitable, so combatting that now is likely only delaying the inevitable. But if that AI is trained on CSAM, it is inherently unethical to use.

    Whether that makes the porn it generates unethical by extension is still difficult to decide, though, because if artists hate AI, then CSAM producers likely do too. Artists worry AI will put them out of business, but couldn’t the same be said of CSAM producers? If AI has the potential to run CSAM producers out of business, then it would be a net positive in the long term, even if the images created in the short term are unethical.

    • Ookami38 · 5 months ago

      Just a point of clarity: an AI model capable of generating CSAM doesn’t necessarily have to be trained on CSAM.

        • Ookami38 · 5 months ago

          Why is that? The whole point of generative AI is that it can combine concepts.

          You train it on the concept of a chair using only red chairs. You train it on the color red, and the color blue. With this info and some repetition, you can have it output a blue chair.

          The same applies to any other concept: larger, smaller, older, younger; man, boy, woman, girl; clothed, nude, etc. You can train each of them individually and gradually, then generate things that combine those concepts.

          Obviously this is harder than just training on data of exactly what you want. It’s slower, it takes more effort, and the results are inconsistent, but they are results. You can then curate the most viable of the images created this way to train a new, refined model.
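
          For the output step, a local pipeline really is just a prompt away. This is only an illustrative sketch of that “blue chair” combination (the model name here is an arbitrary example, not anything specific):

          ```python
          # Minimal sketch: a local text-to-image pipeline combining two learned
          # concepts ("blue" and "chair") into one output, regardless of whether
          # any single training image showed a blue chair. Model name is illustrative.
          import torch
          from diffusers import StableDiffusionPipeline

          pipe = StableDiffusionPipeline.from_pretrained(
              "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
          ).to("cuda")

          image = pipe("a photo of a blue chair").images[0]
          image.save("blue_chair.png")
          ```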

          • Todd Bonzalez · 5 months ago

            Yeah, there are photorealistic furry image models, and I have yet to meet an anthropomorphic dragon IRL.

    • JovialMicrobial · 5 months ago

      I think one of the many problems with AI-generated CSAM is that, as AI becomes more advanced, it will become increasingly difficult for authorities to tell the difference between what is AI-generated and what isn’t.

      Banning all of it means authorities don’t have to sift through images trying to distinguish between the two. If one image is declared to be AI-generated and it’s not, well, that doesn’t help the victims or create fewer victims. It could also make the horrible people who do abuse children far more comfortable putting that stuff out there, because it can hide amongst all the AI-generated material, meaning authorities would have to go through far more images before finding ones with real victims in them. Making all of it illegal prevents those sorts of problems.

      • PM_Your_Nudes_Please · 5 months ago

        And that’s a good point! Luckily, it’s still (usually) fairly easy to identify AI-generated images. But as the models get more advanced, that will likely become harder and harder to do.

        Maybe some sort of required digital signature for AI art would help; something like a cryptographic signature in the metadata, made with the generator’s private key and verifiable against a published public key, that can’t be falsified after the fact. Anything without that known and trusted AI signature would by default be treated as the real deal.

        But this would likely require large-scale changes to existing image formats, if they could even support it at all. It’s the type of thing that would require people way smarter than myself. But even that feels like a bodged solution to a problem that only exists because people suck. And if it required registration with a certificate authority (like an HTTPS certificate does), then it would be a hurdle for local AI instances to jump through, because they would need a trusted certificate before they could sign their images.
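
        As a very rough sketch of the signing idea (this assumes an Ed25519 keypair and Python’s cryptography library, and hand-waves the metadata-embedding part entirely):

        ```python
        # Rough sketch, assuming Ed25519 and the Python "cryptography" library.
        # A generator would sign the image bytes with its private key and ship the
        # signature in the file's metadata; anyone with the published public key
        # could then verify that the image is the one that was signed.
        from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
        from cryptography.exceptions import InvalidSignature

        private_key = Ed25519PrivateKey.generate()  # held by the AI service / local instance
        public_key = private_key.public_key()       # published or registered with some authority

        image_bytes = open("generated.png", "rb").read()
        signature = private_key.sign(image_bytes)   # this is what would go in the metadata

        try:
            public_key.verify(signature, image_bytes)
            print("valid: bytes match the trusted AI signature")
        except InvalidSignature:
            print("invalid: not signed by this key, or the image was altered")
        ```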

    • Kalcifer · 5 months ago

      But it’s icky, so many people still think it should be illegal.

      Imo, not the best framework for creating laws. Essentially, it’s an appeal to emotion.

      • PM_Your_Nudes_Please · 5 months ago

        I wasn’t arguing about current laws. I was simply arguing about public perception, and whether the average person believes it should be illegal. There’s a difference between legality and ethicality. Something unethical can be legal, and something illegal can be ethical.

        Weed is illegal, but public perception says it shouldn’t be.

        • uis · 5 months ago

          Weed is illegal, but public perception says it shouldn’t be.

          Alcohol is worse than weed, yet alcohol is not banned.

          • Todd Bonzalez · 5 months ago

            The mitochondria is the powerhouse of the cell.

        • PirateJesus · 5 months ago

          Everybody is American. They just don’t know it yet.

          Gospel of the Jesus