• fidodo · 5 months ago

      You can ask it to make an image of a man made of pizza. That doesn’t mean it was trained on images of that.
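
      (Illustrative only: a minimal sketch of that kind of compositional prompt, assuming the Hugging Face diffusers library and the public stabilityai/stable-diffusion-2-1 checkpoint; the model name and prompt are just example choices, any text-to-image model behaves similarly.)

      ```python
      # Minimal sketch: compose two concepts the model learned separately.
      # Assumes the `diffusers` library and the stabilityai/stable-diffusion-2-1
      # checkpoint (an example choice, not the only option).
      import torch
      from diffusers import StableDiffusionPipeline

      pipe = StableDiffusionPipeline.from_pretrained(
          "stabilityai/stable-diffusion-2-1", torch_dtype=torch.float16
      ).to("cuda")

      # No "pizza man" training image is needed; the prompt blends the separately
      # learned concepts "man" and "pizza" into one generated image.
      image = pipe("a man made entirely of pizza, photorealistic").images[0]
      image.save("pizza_man.png")
      ```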

      • dustyData · 5 months ago

        But it means that it was trained on people and on pizza. If it can produce CSAM, it means it had access to pictures of naked minors, even if they weren’t in a sexual context.

        • bitwaba · 5 months ago

          Minors are people. It knows what clothed people of all ages look like. It also knows what naked adults look like. The whole point of AI is that it can fill in the gaps and create something it wasn’t trained on. Naked + child is just a simple equation for it to solve.

    • MeanEYE · 5 months ago

      You can always tell when someone has no clue about AI but has read about it online.

    • herrvogel · 5 months ago

      The whole point of those generative models is that they are very good at blending different styles and concepts together to create coherent images. They’re also really good at editing images to add or remove entire objects.
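
      (A hedged sketch of that editing capability, assuming the diffusers inpainting pipeline and the stabilityai/stable-diffusion-2-inpainting checkpoint; the image and mask file names are placeholders.)

      ```python
      # Sketch: add an object to an existing photo by inpainting a masked region.
      # Assumes `diffusers`, `Pillow`, and the stable-diffusion-2-inpainting
      # checkpoint; file paths are placeholders.
      import torch
      from PIL import Image
      from diffusers import StableDiffusionInpaintPipeline

      pipe = StableDiffusionInpaintPipeline.from_pretrained(
          "stabilityai/stable-diffusion-2-inpainting", torch_dtype=torch.float16
      ).to("cuda")

      photo = Image.open("table.png").convert("RGB")      # placeholder input photo
      mask = Image.open("table_mask.png").convert("RGB")   # white where the new object goes

      # Only the masked area is regenerated, guided by the prompt; the rest of
      # the photo is left untouched.
      edited = pipe(
          prompt="a bowl of fruit on the table", image=photo, mask_image=mask
      ).images[0]
      edited.save("table_with_fruit.png")
      ```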

    • mightyfoolish · 5 months ago (edited)

      I think what @deathbird@mander.xyz meant was that the AI could be trained on what sex is and on what children are at different points. Then a user request could put those two concepts together.

      But as the replies I got show, there were multiple ways this could have been accomplished. All I know is AI needs to go to jail.