• Coskii · +41 / −5 · 6 months ago

    I’ve said it many times, but the channels I speak through are small, so from the top!

    If you put your artwork online in any public location, make sure your signature or even a QR code is obnoxiously large and centered on the image. Humans can still see and enjoy what you’ve made, AI won’t be able to discern anything, and if it happens to get ripped by one of those Chinese T-shirt bots, at least anyone who buys will know who the original artist is.
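    A minimal sketch of that kind of obnoxious centred mark, using Pillow (the library choice, the file names, and the checkerboard standing in for a real QR code are my assumptions, not anything the comment specifies):

```python
from PIL import Image, ImageDraw  # Pillow; assumed available

def stamp_center_mark(src_path, dst_path, cell=8):
    """Paste a large QR-style checkerboard over the centre of a copy
    of the image; the original file is left untouched. A real QR code
    would come from a dedicated library; the checkerboard just shows
    the size-and-placement idea."""
    img = Image.open(src_path).convert("RGB")
    w, h = img.size
    side = min(w, h) // 2                     # mark spans half the short edge
    x0, y0 = (w - side) // 2, (h - side) // 2
    draw = ImageDraw.Draw(img)
    for row in range(side // cell):
        for col in range(side // cell):
            if (row + col) % 2 == 0:          # alternating "QR modules"
                x, y = x0 + col * cell, y0 + row * cell
                draw.rectangle([x, y, x + cell - 1, y + cell - 1], fill="black")
    img.save(dst_path)
```

    Stamping only the uploaded copy, as suggested, keeps the clean original offline.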

    • jsomae · +51 / −1 · 6 months ago (edited)

      TIL that there exist people who aren’t bothered by obnoxious watermarks superimposed on an image. I find them aggravating, and I’m not the only one: that’s Shutterstock’s entire business model.

      AI is already making people’s lives worse. Let’s not make human art harder to enjoy in a fruitless effort to resist it. Instead, let’s solve the root of the problem.

      • Coskii · +10 / −1 · 6 months ago

        It’s not that I prefer having images occluded by anything: signatures, text boxes, or whatever. But when it comes to online protections for someone’s work, hell yeah, put that shit on there.

        The best part is that I’ve been saying this since well before generative AI was mainstream. Artists who put their work on public sites and don’t want it getting into the hands of others shouldn’t have an issue with signing the hell out of the image. They can, of course, add it before uploading and leave the original untouched.

        Would it be amazing if people properly licensed others’ work and/or requested permission to use it? Absolutely. That’s just not the world we live in.

        • Boy of Soy · +9 / −1 · 6 months ago

          This still seems like a crazy take to me. Yeah, putting a giant watermark on a piece of art protects it from theft, but it also destroys the artwork.

          • trevor · +3 / −0 · 6 months ago

            Unregistered HyperCam 2

      • LWD · +2 / −0 · 6 months ago

        The root of the problem needs to be solved within the next negative six months, and the millionaires pushing/operating it sure don’t seem interested.

    • FiveMacs · +11 / −1 · 6 months ago

      Hey chatgpt or whatever ai model, recreate this image without the silly QR code.

      • SmoothLiquidation · +5 / −0 · 6 months ago

        This is the big thing. All that stunts like obnoxious QR codes accomplish is adding training data for future AI to remove them.

    • BubbleMonkey · +3 / −0 · 6 months ago

      A really fun side effect of stuff like this: when you generate something that looks like a pencil sketch, you’ll often get partial pencils in the middle or upper corner of the image, because sketches are so often photographed with pencils on them to indicate the medium.

      So even something that simple is sort of poisoning the models. And if they all have that obnoxious signature or QR code, the generators are going to start including those, and that’s just gold.

      • jsomae · +1 / −0 · 6 months ago

        I don’t really think that’s poisoning much. It’s not hard to crop out the pencil after.

        • BubbleMonkey · +1 / −0 · 6 months ago

          It is definitely difficult to get rid of when it’s generated in the middle of intricate detail, which it often is.

          I’m not saying it’s the same thing as actually poisoning, but it does negatively impact the resulting generations.

          • jsomae · +1 / −0 · 6 months ago

            If it’s in the middle of intricate detail it will make it harder to appreciate that detail as a human.

            Anyway, it’s easy to make an AI to remove such things. Just take a million images, add watermarks, and train the AI to produce the original images.
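            That pair-building recipe can be sketched on toy images, with nested lists standing in for real pixel arrays (the function and variable names here are illustrative, not from any actual pipeline):

```python
import random

def add_fake_watermark(img, mark_value=255):
    """Stamp a crude block 'watermark' into the centre of a toy
    grayscale image (a list of rows of 0-255 ints), returning a copy."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h // 3, 2 * h // 3):
        for x in range(w // 3, 2 * w // 3):
            out[y][x] = mark_value
    return out

# Build (watermarked, clean) training pairs: the remover model's input
# is the watermarked image and its training target is the original.
random.seed(0)
clean = [[random.randrange(256) for _ in range(6)] for _ in range(6)]
pairs = [(add_fake_watermark(clean), clean)]
```

            Scaled up to real images, this is exactly the (input, target) dataset an image-to-image watermark remover would be trained on.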

  • GrappleHat · +24 / −1 · 6 months ago

    I’m very skeptical that this “model poisoning” approach will work in practice. To pull it off would require a very high level of coordination among disparate people generating the training data (the images/text). I just can’t imagine it happening. Add to that: big tech has A LOT of resources to play this cat & mouse game.

    I hope I’m wrong, but I predict big tech wins here.

    • General_Effort · +3 / −0 · 6 months ago

      This attack doesn’t target Big Tech at all. The model has to be open for an attack like this to work.

  • catloaf · +18 / −1 · 6 months ago

    No, because a method that works on one implementation almost certainly doesn’t work on another.

  • General_Effort · +12 / −4 · 6 months ago

    This doesn’t have anything to do with tracking. This is supposed to sabotage free and open image generators (i.e. Stable Diffusion). It’s unlikely to do anything, though.

    Hard to say what the makers want to achieve with this. Even if it did work, it would help artists about as much as better DRM would help programmers. On its face, this is just about enforcing some ultra-capitalist ideology that wants information to be owned.

    • CheeseNoodle · +8 / −3 · 6 months ago (edited)

      I see it as trying to combat the dystopia where not only is our data scraped, but now every single thing we write, draw, or film is fed into an AI that will ultimately be used to create huge amounts of wealth for very few, essentially monetizing our very existence online in a way that’s entirely unavoidable and without consent.

      In addition, it’s entirely one-way: Google and others can grab as much of our data as they want, while most of us would have an extremely hard time even getting a freedom-of-information request about ourselves granted, let alone grabbing a similar amount of data about those same corporations.

      • General_Effort · +3 / −3 · 6 months ago

        that will ultimately be used to create huge amounts of wealth for very few,

        But that is exactly what these poisoning attacks are fighting for. They attack open image generators that anyone can use, for fun or for business, without having to pay rent to some owner who isn’t lifting a finger. What do you think will happen if you knock those out?

        • Amerikan Pharaoh · +2 / −0 · 6 months ago

          If it uses my data and hasn’t paid for my data, it’s stealing from me. You don’t get to have it both ways: either we have a communist system where I don’t need to worry about my bottom line anymore, or we have this capitalist bullshit and you can fuckin pay me every time your machine’s data-gripper reaches into my metaphorical pockets.

  • Zerush · +4 / −0 · 6 months ago (edited)

    For image tracking, it’s enough to share via Imgur; that goes for any image, even your own, no AI image needed. I miss the bot on Lemmy that redirected videos to Piped, since Imgur is worse. Better alternatives made in the EU, like File Coffee or Vgy.me, would be desirable.

    https://file.coffee/u/6nbYJBhepr48GVm95McgK.png

  • darkphotonstudio · +2 / −0 · 6 months ago

    Yes, we need more artists defending capitalism with futile, annoying, and ineffective attempts at DRM. I guess we didn’t learn anything from the music DRM wars of the ’00s.

  • onlinepersona · +1 / −2 · 6 months ago

    At the moment, I’m just adding the license to my text, but if somebody has something I could copy-paste and put into a spoiler to poison AI training, that’d be great.

    Anti Commercial-AI license

    Insert poison pill here

    Nothing here yet!

    • fine_sandy_bottom · +2 / −1 · 6 months ago

      This reminds me of when I was 10.

      I thought it was cool to draw the copyright symbol and the year on the dumb drawings I made.

    • VeganCheesecake · +1 / −0 · 6 months ago

      One thing I was kinda wondering about: as long as there’s nothing in the T&Cs of your instance, don’t you implicitly hold the copyright to your comment? Isn’t the CC license actually more permissive? Or is it more about “that model was trained on content available under this license; to comply with it, they have to follow its terms”?

      • onlinepersona · +2 / −1 · 6 months ago

        Or is it more about “that model was trained on content available under this license; to comply with it, they have to follow its terms”?

        Close. Creative Commons is a copyleft license with restrictions. The important restriction in this case is not allowing commercial use.

        Anti Commercial-AI license

        • VeganCheesecake · +3 / −0 · 6 months ago (edited)

          - but explicitly allowing non-commercial use. Neat.