Rep. Joe Morelle, D-N.Y., appeared with a New Jersey high school victim of nonconsensual sexually explicit deepfakes to discuss a bill stalled in the House.

  • BlaiddEnglish · +59/−1 · 9 months ago

    Creating fake child porn of real people using things like Photoshop is already illegal in the US, I don’t see why new laws are required?

    • BgugiEnglish · +39/−2 · 9 months ago (edited)

      Well those laws clearly don’t work. So we should make new laws! Ones that DEFINITELY WILL work! And if they don’t, well I guess we just need more laws until we find ones that do.

      • NotMyOldRedditNameEnglish · +17/−4 · 9 months ago

        Since we need a rule explicitly for AI-related cases, even though it’s already covered by others, let’s ensure we also make a 100-page law for cases where the material is made in Photoshop, and another 80 pages if it was made in GIMP. If you use MS Paint to do it, we need a special 200-page law that makes the punishment even harsher, because damn, you got skillz and need to be punished more.

      • LWDEnglish · +8/−3 · 9 months ago (edited)

        [deleted]

        • BgugiEnglish · +11/−2 · 9 months ago

          No, I’m not criticizing the bill’s content. If you don’t enforce existing laws, new ones won’t work either. At best, the new ones are an opportunity for people to huff and puff and pat themselves on the back at the cost of actual victims. At worst, they’re smoke and mirrors for what the new law actually does.

          • LWDEnglish · +2/−4 · 9 months ago (edited)

            [deleted]

    • General_EffortEnglish · +12/−0 · 9 months ago

      This is not at all about protecting children. That’s just manipulation. In truth, kids are more likely to be prosecuted than protected by this bill.

      There are already laws that could be used against teen bullies, but it’s rarely done. (IMHO, it would create more harm than good anyway.)

      This is part of an effort to turn the likenesses of people into intellectual property. Basically, it is about more money for the rich and famous.

      This bill would even apply to anyone who shares a movie with a sex scene in it. It’s enough that the “depiction” is “realistic” and “created or altered using digital manipulation”. Pretty much any photo nowadays, and certainly any movie, can be said to be “altered using digital manipulation”. There’s no mention of age, deception, AI, or anything else that the PR bullshit suggests.

    • k-radEnglish · +11/−2 · 9 months ago

      Regulatory capture. OpenAI wants to kick down the ladder.