• Deello · 2 months ago

    I mean yes but that’s like saying Bitcoin is used by criminals to buy drugs and weapons. The problem is that’s not their only use.

    • Supermariofan67 · 2 months ago

      Wait till you hear about the idiots who unironically make that argument for banning Bitcoin too

    • Baut [she/her] auf. · 2 months ago

      Bitcoin is a bad example, since it’s not designed as a private currency. Monero/XMR is actually usable.

    • WarmApplePieShrek · 2 months ago

      That’s like saying Voat isn’t only used by incel trolls who got banned from reddit

    • stonerboner · 2 months ago

      Yep. The issue is that they put out a tool that does some good things, but is also heavily adopted by criminals who piggyback on it.

      Should we let child abuse just proliferate with these tools, because there’s so much need for privacy? How do you weed out the bad without kneecapping the good? There’s no good answer here. The same properties that make the tech good also enable the bad.

      There has to be a certain level of knowledge and acceptance of the bad parts to continue developing it. It’s a catch-22, so law enforcement has to choose between sacrificing privacy and allowing a tool to exist that proliferates child abuse material and other ills.

      There are valid arguments for the importance of privacy, and valid arguments for making sure these crimes don’t have a safe haven. Action to either end will hurt some people and enrage others.

      • Grimpen · 2 months ago

        The standard I recall being established back in the nineties, as to whether strong encryption was even legal in the US, was “substantial non-infringing use” or similar. It’s been a while.

        The problem with key escrow or anything similar is that any prescribed circumvention is also available to the “bad guys”.

        I think Telegram’s stance would be that they can’t moderate because of strong end-to-end encryption. Back in the day the parallel would have been made to the phone system or mail.

        Of course this is all happening in France, so I have no idea what the combination of French and EU laws will have on this, but I would still broadly expect that if a parallel can be made to mail or phone, Telegram would be in the clear. The phone company and mail service have no expectation of content moderation.

        I guess we’ll see.

        • stonerboner · 2 months ago

          The huge difference between mail or phone and telegram is that both mail and phone work with law enforcement, with useful records being made available upon subpoena. Telegram, by design, will not.

          If you think drawing that parallel is useful to Telegram, they would then also be required to maintain the same standards of security as the mail, with package inspections, drug dogs, entire teams of government officials investigating illegal activities, etc.

          The criminals use it precisely because it is not a parallel to other available channels, as it circumvents those safeguards.