• ocassionallyaduck · ↑8 ↓1 · 5 months ago

    The cat’s out of the bag on this. A ban is enforceable for now, maybe, because the models are mostly hosted online and compute-intensive.

    In 2028, though, when you can train your own model and generate your own images locally without burning a server farm? Local training and generation have to happen for ML to keep growing and catch on.

    Welp. Then there’s infinite fake child porn, because you cannot police every device and model.

    Because of how tech companies have handled this technology, this is not an “if” scenario. It is guaranteed now.

    • Asafum · ↑13 ↓0 · 5 months ago

      Because you cannot police every device and model.

      FBI: “Challenge accepted. Hey Microsuck, let’s collaborate on a Windows feature that records everything you do.”

      Microsuck: “One step ahead of you. We already have it.” (Not a joke.)

      • Karyoplasma · ↑4 ↓0 · 5 months ago

        You cannot force people to use Micro$oft, but I’m sure this would only increase their market share, because it will be mediatized in a way that paints operating systems that don’t invade your privacy as morally evil, since good guys have nothing to hide. Kinda like how they shifted the public image of pleading the Fifth into a silent admission of having committed a crime.

    • Gluten6970 · ↑4 ↓1 · 5 months ago

      Uhhh, these types of images kinda already require local models.

    • TheObviousSolution · ↑1 ↓2 · 5 months ago

      I remember when they tried to do the same with CRISPR. Glad that didn’t take off and it remained largely limited to industry and academia. But then again, Wuhan…