• fidodo · 5 months ago

    You can ask it to make an image of a man made of pizza. That doesn’t mean it was trained on images of that.

    • dustyData · 5 months ago

      But it means that it was trained on people and on pizza. If it can produce CSAM, it means it had access to pictures of naked minors, even if they weren’t in a sexual context.

      • bitwaba · 5 months ago

        Minors are people. It knows what clothed people of all ages look like. It also knows what naked adults look like. The whole point of AI is that it can fill in the gaps and create something it wasn’t trained on. Naked + child is just a simple equation for it to solve.