• kromem · 5 months ago

    But I’m still torn on the first scenario.

    To me it comes down to a single question:

    “Does exposure to and availability of CSAM for pedophiles correlate with an increased or decreased likelihood of harming a child?”

    If there’s a reduction effect, where providing an outlet for arousal that doesn’t actually harm anyone lowers that likelihood, that sounds like a pretty big win.

    If there’s a force-multiplier effect, where exposure and availability feed the obsession and make harming children more likely, then society should make the AI-generated version illegal too.

    • TheDoozer · 5 months ago

      Hoooooly hell, good luck getting that study going. No ethical concerns there!

      • ricecake · 5 months ago

        How they’ve done it in the past is by tracking the criminal histories of people caught with CSAM, arrested for abuse, or some combination thereof, or by tracking the outcomes of people seeking therapy for pedophilia.

        It’s not perfect due to sample bias, and the results are quite inconsistent, even among similar populations.

    • HonoraryMancunian · 5 months ago

      To complicate matters further, I’m willing to bet it’ll differ from person to person.

    • state_electrician · 5 months ago

      I think the general consensus is that the availability of CSAM is bad, because it desensitizes viewers and makes the harming of actual children more likely. But I must admit I only remember reading about that and don’t have a scientific source.