Hello everyone,

We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting the CSAM (Child Sexual Abuse Material) posts all day, but there is little we can do: since we changed our registration policy, the attackers simply post from other instances.

We keep working on a solution; we have a few things in the works, but they won’t help us right now.

Thank you for your understanding and apologies to our users, moderators and admins of other instances who had to deal with this.

Edit: @Striker@lemmy.world, the moderator of the affected community, made a post apologizing for what happened. But this could not have been stopped even with 10 moderators. And if it wasn’t his community, it would have been another one. It is clear this could happen on any instance.

But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what’s next very soon.

Edit 2: removed the bit about the moderator tools. That came out a bit harsher than we meant it. It’s been a long day, and having to deal with this kind of stuff got some of us a bit salty, to say the least. Remember, we also had to deal with people posting scat not too long ago, so this isn’t the first time we have felt helpless. Anyway, I hope we can announce something more positive soon.

    • ivanafterall · 1 year ago

      This isn’t as crazy as it may sound either. I saw a similar situation, contacted them with the information I had, and the field agent was super nice/helpful and followed up multiple times with calls/updates.

  • Striker · 1 year ago

    I would like to extend my sincerest apologies to all of the users here who liked lemmy shit posting. I feel like I let the situation grow too far out of control before getting help. Don’t worry, I am not quitting; I fully intend on staying around. The other two mods deserted the community, but I won’t. DM me if you wish to apply for mod.

    Sincerest thanks to the admin team for dealing with this situation. I wish I had linked in with you all earlier.

    • lwadmin (OP, mod) · 1 year ago

      @Striker@lemmy.world this is not your fault. You stepped up when we asked you to and actively reached out for help getting the community moderated. But even with extra moderators this cannot be stopped. Lemmy needs better moderation tools.

      • Rob T Firefly · 1 year ago

        Hopefully the devs will take the lesson from this incident and put some better tools together.

        • WhiskyTangoFoxtrot · 1 year ago

          Or we’ll finally accept that the core Lemmy devs aren’t capable of producing a functioning piece of software and fork it.

          • Bread · 1 year ago

            It’s not easy to build a social media app, and forking it won’t make this particular problem any easier to solve. Joining forces to tackle an inevitable problem is the only solution. The Lemmy devs are more than willing to accept pull requests for software improvements.

        • Whitehat Hacker · 1 year ago

          There’s a Matrix room for building mod tools here; maybe we should bring up this issue there, just in case they aren’t already aware.

    • RightHandOfIkaros · 1 year ago

      Trolls? In most regions of the planet, I am fairly certain their actions would be considered criminal.

    • expatriado · 1 year ago

      Troll is too mild a term for these people.

      • pensivepangolin · 1 year ago

        Yeah, honestly, report all of those accounts to law enforcement. It’s unlikely they’d be able to do much, I assume, but these people are literally distributing CSAM.

      • Feathercrown · 1 year ago

        How about “pedophile”? I mean, they had to have the images to post them.

        • jarfil · 1 year ago

          “Terrorist.” Having the images doesn’t mean they liked them; they used them to terrorize a whole community, though.

      • PM_Your_Nudes_Please · 1 year ago

        Yeah, this isn’t just joking or shitposting. This is the kind of shit that gets people locked up in federal pound-you-in-the-ass prison for decades. The feds don’t care if you sought out the CSAM, because it still exists on your device regardless of intent.

        The laws about possessing CSAM are written in a way that any plausible deniability is removed, specifically to prevent pedophiles from being able to go “oh lol a buddy sent that to me as a joke” and getting acquitted. The courts don’t care why you have CSAM on your server. All they care about is the fact that you do. And since you own the server, you own the CSAM and they’ll prosecute you for it.

        • gammasfor · 1 year ago

          And it’s not just the instance admins who would be at risk. Any time you view an image, your device makes a local copy of it, meaning everyone who viewed the image, even accidentally, is at risk as well.

    • Whitehat Hacker · 1 year ago

      That’s not a troll. CSAM goes well beyond trolling; “pedophile” would be a more accurate term for them.

      • CoderKat · 1 year ago (edited)

        Yeah. A troll might post something like a ton of oversized images of pig buttholes. Who the fuck even has access to CSAM to post? That’s something you only have on hand if you’re a predator already. Nor is it something you can shrug off like “lol I was only trolling”. It’s a crime that will send you to jail for years. It’s a major crime that gets entire police units dedicated to it. It’s a huuuuge deal and I cannot even fathom what kind of person would risk years in prison to sabotage an internet forum.