Adobe recently updated its terms of use, and although companies do this all the time, these new changes have sparked a significant amount of discord and discussion among users.

The updated terms of use give Adobe access to any form of media uploaded to its Creative Cloud and Document Cloud services, a change which immediately sparked a privacy backlash and led to many users calling for a boycott. So annoyed were paying customers that Adobe was forced to issue a statement clarifying what the updated terms mean, and what they cover.

The changes Adobe made include switching the wording from “we will only access, view or listen to your content in limited ways” to “we may access, view or listen to your content”, and adding “through both automated and manual methods”. In the Content section, Adobe also changed how it describes scanning user data, adding the explicit mention of manual review.

In its explanation of the terms changes, Adobe said, “To be clear, Adobe requires a limited license to access content solely for the purpose of operating or improving the services and software and to enforce our terms and comply with law, such as to protect against abusive content.”

While the intentions behind these changes might be to enhance service quality and ensure compliance with legal standards, permitting the company such broad access to personal and potentially sensitive content clearly feels intrusive to many users. The shift from an explicit limitation to a more open-ended permission for content access could be seen as a step backward for user control and data protection, and it raises concerns about privacy and user trust that Adobe’s statement doesn’t fully address.

  • 555:

    Define illegal content. Rainbow flags in Russia or the Middle East?

    • WallEx:

A picture of a man in front of tanks in China. Some fictional bear.

      • 555:

        I hear Winnie the Pooh is pretty controversial some places.

        • essell:

          Well, if he’s going to walk about all the time without trousers, what does he expect?

    • irotsoma:

      Yes, this is exactly what it’s for, as well as Winnie the Pooh in China, LGBTQ+ materials in Florida, or any other ridiculous laws. As fascism is taking over many countries, including the US, UK, and other Western countries, they are pressuring content storing companies to add backdoors to allow hunting down dissidents.

      Oh, and also this is a way to allow selling the content to train AI since it’s less obvious that it is allowed with this kind of vague wording.

      • 555:

        No doubt it’s about training AI on design. All these designers are putting themselves out of business. Nothing to be done about it. Corporations always win.

  • JeeBaiChow:

I’m not sure if this is an argument for ignoring the ToS, or one for scrutinizing the shit out of it. Why bother, if they’re just gonna flip on you once the software becomes part of your established/preferred workflow? I want my perpetual standalone offline systems back. Never used PS since v4.

    • BluescreenOfDeath:

      AFAIK, the unilateral nature of TOS/EULA agreements in the day of Software as a Service hasn’t been litigated. Which means there isn’t a court’s opinion on the scope or limits of a TOS/EULA and what changes can be made.

      Currently, Adobe has the full force of contract law to initiate this change without any input from consumers because a case about this has never made it to the courts.

It’ll be interesting to see where this goes, but Adobe will likely backpedal on its TOS language before any case gets to a judge, because the last thing any company wants is for a TOS/EULA agreement to be fundamentally undermined by a court.

      • BearOfaTime:

        It’s interesting to see large organizations backpedalling when it’s clear things will head to court. Tells us they know their shit won’t stand up in court, and it would set a precedent making it easy for “ambulance chaser” lawyers to file a whole bunch of cases.

      • JeeBaiChow:

Yes, but why bother? They’ll just change it on you again. Better to skip Adobe altogether.

        • disguy_ovahea:

          Totally. I went to Affinity when Adobe went subscription.

Just wanted to share ToS;DR. It’s a really handy free site that translates ToS and EULA documents into plain language.

  • CosmoNova:

    Well I know a lot of German companies that would look for alternatives immediately if their management actually used the internet and knew about this.

    • General_Effort:

      The way it looks, Adobe has to do this to comply with EU law.

  • Autonomous User:

    They control it, not us. It fails to include a libre software license text file, so what did you expect? 🤡🤡🤡

  • shapis:

    Is not using Adobe a realistic option in professional settings atm?

    • grue:

      Maybe, maybe not, but I would argue that even without viable alternatives, people in professional settings no longer have a choice. It is no longer possible to comply with Adobe’s ToS and many business clients’ confidentiality and/or exclusivity requirements at the same time.

    • MamboGator:

      deleted by creator

      • ripcord:

Although there are people using those professionally, they’re definitely not the ones that generally get recommended for it. And GIMP in particular IMO kinda sucks, although it has a lot of power.

A big one that’s gaining a lot of steam is the Affinity Suite.

  • corroded:

    Is Creative Cloud a requirement for using Adobe products these days? Surely someone can just save data locally instead. Media files (especially raw video) can be enormous.

    • Obi:

Yeah, I save everything locally. I technically have a small bit of cloud storage included with my plan (the smallest/cheapest one you can get, LR+PS), but I never use it. I’ve moved to alternatives for everything else but am still stuck with LR+PS for photo work for now.

  • just_another_person:

    I’m positive they got notified they were hosting a massive amount of CSAM, or similarly awful AI generated shit since it’s the Wild West out there now. This was their only way out.

    • sabin:

Sounds like a smokescreen to me. All file-sharing services have this problem. The solution is to respond to subpoena requests and let the government do its job. They don’t have to allow themselves to arbitrarily violate their users’ privacy in order to do that.

      • just_another_person:

        No, they don’t. If you’re storing something that is found by a law enforcement agency, you are legally liable. That’s the difference.

You can’t just say out loud, “Hey users, please stop storing CSAM on our servers.” Not how that works.

        • sabin:

          Adobe is not a video distribution platform. They do not have this level of culpability.

          • just_another_person:

            Adobe CLOUD requires storage of images and video on their servers to edit them. That’s what this is about.

            • sabin:

              That’s not the same as content distribution.

              Sharing content to clients cannot be effectively done through creative cloud.

It does not make sense to try to stop distribution at the level of video editing. Not only is the thought of child predators making regular use of professional editing software completely absurd, but even if you assume they do, why the fuck do you think they would use the built-in cloud sharing tools to do so? They would just encrypt the contents and transmit them over any other file-sharing service.

It makes no sense to implement this measure because it does absolutely nothing to impede criminals, but it gives a company well known for egregious privacy violations unprecedented access to information that completely law-abiding clients have legitimate reasons to keep private.

It is a farce. A smokescreen intended to encroach on customers’ precious data while doing nothing to assist law enforcement.

    • greenskye:

I realize it’s gross and icky and morally problematic, but I really wonder if trying to have the government crack down on AI-generated CSAM is worth the massive risk to freedom of speech and privacy that it seems like it’s going to open us up to. It’s a massive blank check for everyone to become a big brother.

      • just_another_person:

There are no laws about it anywhere right now, but I’m sure it’s about something more real. This has played out many times in the past (Amazon, Apple, Google, FB, etc.) across many different policing agencies: if they identify a HUGE host that is a problem, they notify them first and allow them to address the issue before making themselves known and cracking down on the offenders. This just feels like that same thing again.

        AI or not, if a court can prosecute a case, they’ll do so.