What are the risks associated with this? With image uploading capabilities and the like, I’m thinking there might be an issue with people posting highly illegal content. I used to run some smaller forums 15 years ago and that went fine, but it feels like the risks are higher today. I’m thinking both about one’s own mental health in needing to moderate such content, and about whether it would be a legal liability to run an instance if people post illegal content.

  • empireOfLove2 (English) · ↑29 ↓0 · 3 days ago

    Federation means any content posted to any federated instance gets cached on your side and you become a hoster of it.

    This includes if someone posts child porn, or creates an instance for it and starts spamming it. You’re possibly liable, and then have to deal with reviewing and cleaning it up to cover your ass.

    • Shide (OP, English) · ↑23 ↓0 · 3 days ago

      I’d love to support the fediverse, but this sounds like a huge hassle and a real problem. Maybe it’s just me, though; I’m glad that there are others who have decided to host instances.

      • Scrubbles (English) · ↑21 ↓0 · 3 days ago

        I never thought I’d be a registered CSAM reporter with the feds, but then I decided to host public content via Lemmy. Turns out that while 99.9% of users are great or fine, that 0.1% are just assholes for the sake of being assholes.

        • Quacksalber (English) · ↑10 ↓0 · 3 days ago

          I think Lemmy/Mbin would benefit from ‘moderation pools’. The basic idea is that, if you subscribe to or join a moderation pool, your instance will automatically copy any moderation action taken on content your instance also hosts. This would allow multiple single-admin instances to moderate even during off-hours of any single admin.
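
          To make the idea concrete, here is a rough sketch of what a pool “follower” might do. Everything below is hypothetical: the pool feed, the local admin endpoints, and the field names are made up, and neither Lemmy nor Mbin has such a feature today.

          ```python
          # Hypothetical "moderation pool" follower (not a real Lemmy/Mbin feature):
          # poll a shared feed of removal actions from pool members and mirror any
          # removal that targets content this instance also hosts.
          import requests

          POOL_FEED = "https://pool.example.org/actions.json"   # made-up pool feed
          LOCAL_API = "https://my-instance.example.org/api"     # made-up local admin API


          def mirror_pool_actions(auth_token: str) -> None:
              actions = requests.get(POOL_FEED, timeout=10).json()
              for action in actions:
                  if action["type"] != "remove_post":
                      continue
                  # ActivityPub object IDs are global, so check whether we hold a
                  # local copy of the same object before mirroring the removal.
                  found = requests.get(
                      f"{LOCAL_API}/lookup",
                      params={"ap_id": action["object_ap_id"]},
                      timeout=10,
                  )
                  if found.status_code == 200:
                      requests.post(
                          f"{LOCAL_API}/remove",
                          json={
                              "ap_id": action["object_ap_id"],
                              "reason": "mirrored from moderation pool",
                          },
                          headers={"Authorization": f"Bearer {auth_token}"},
                          timeout=10,
                      )
          ```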

        • empireOfLove2 (English) · ↑9 ↓0 · 3 days ago

          Hmm, this is something I haven’t heard about. Can you actually register as an instance hoster with the FBI or equivalent to say “hey I have a service that may be exposed to CSAM, I do not condone this and will report any cases of it that I see”? If so that could reduce a lot of people’s specific legal fears of hosting.

          • Scrubbles (English) · ↑6 ↓0 · 3 days ago

            Not with the FBI, but with the National Center for Missing and Exploited Children (NCMEC), who collate reports and work with the FBI. Cloudflare and others have services that route all images through their detection systems and will auto-block and report CSAM. I didn’t want to use Cloudflare, but it turns out that if I somehow did accidentally host it, I could be charged with hosting it. I have to report it, or I’m the responsible party.

            • empireOfLove2 (English) · ↑1 ↓0 · 3 days ago · edited 3 days ago

              That’s good to know. I’ve had some half-baked plans to host a public instance for a while (will probably get to it in winter), and honestly the legal risk has been something that’s really held me back. Knowing I have a way to cover my ass when removing it is great.

          • Shide (OP, English) · ↑5 ↓0 · 3 days ago

            Unfortunately this often isn’t applicable outside of the US, as in my case.

        • Pringles (English) · ↑9 ↓0 · 3 days ago

          This is why I decided not to host an instance in the end. Where I live, the laws are such that the hoster is responsible for the content hosted on their servers. So if some shitbag posts CP that gets synced to my server and the authorities somehow find out, it would seriously fuck up my life.

          • OpenStars (English) · ↑5 ↓0 · 3 days ago

            Not only do people avoid creating instances for this reason, but several previously existing instances shut down as a result, like DMV.social.

            • sunzu2 · ↑2 ↓1 · 20 hours ago

              So CSAM can be used as a tool for suppressing instances.

              • OpenStars (English) · ↑2 ↓0 · 19 hours ago

                Obviously, and it always has been. Bullying behaviors *work*, or people (& animals) would not bother to expend the effort.

                • sunzu2 · ↑1 ↓1 · 19 hours ago

                  That’s a helluva mentally damaged threat actor, but I guess it is effective.

                  So how do we know these aren’t a state doing it, or, say, social media competitors going after each other?

        • slazer2au (English) · ↑2 ↓0 · 3 days ago

          If you self-host a single-user instance, do you still need to register? I get registering if you host a multi-user instance.

          • Scrubbles (English) · ↑2 ↓0 · 3 days ago

            If it’s open to the public, yes. Even if visitors don’t have an account, if they can still see the offending content, then yes.

            However, I bet that with nginx you could block public access and require an account: something like “if this isn’t the login page and the request doesn’t carry an auth token, block it” (rough sketch below).
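
            Something along these lines, as a rough and untested nginx sketch. The cookie name (“jwt”) and the upstream names/ports are assumptions based on a default lemmy / lemmy-ui Docker setup, so verify them against your own config:

            ```nginx
            # Untested sketch: require an auth cookie for the web UI, but keep the
            # login page and the API reachable. Cookie name and ports are assumptions.
            map $cookie_jwt $no_auth {
                ""      1;   # no jwt cookie -> anonymous visitor
                default 0;
            }

            server {
                listen 443 ssl;
                server_name my-instance.example.org;

                location /login {
                    proxy_pass http://lemmy-ui:1234;
                }

                # App clients and federation use the API. A real config would also
                # need to allow ActivityPub fetches (Accept: application/activity+json)
                # on regular paths, which is left out here for brevity.
                location /api/ {
                    proxy_pass http://lemmy:8536;
                }

                location / {
                    if ($no_auth) {
                        return 302 /login;   # bounce anonymous visitors to the login page
                    }
                    proxy_pass http://lemmy-ui:1234;
                }
            }
            ```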

            • abff08f4813c · ↑2 ↓0 · 20 hours ago

              I’m looking into doing this on my single-user instance. I’ve already modified the code so it doesn’t host images that get federated (it simply links to the URL on the original instance), but it would be good to lock things down a bit tighter.

              • Scrubbles (English) · ↑1 ↓0 · 17 hours ago

                Now that they added image proxying I feel a lot better about it, but it’s still risky, since the content gets piped through my server.

      • jjagaimo (English) · ↑13 ↓0 · 3 days ago · edited 3 days ago

        There are some ways to mitigate the majority of that kind of stuff: you can disable image hosting, defederate from instances with poor moderation or poor attitudes, filter out certain keywords, or use cleanup tools like the ones from db0. Not sure if the caching still occurs if you disable pict-rs hosting, though.
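
        As a toy illustration of the keyword-filtering idea (this is not how Lemmy’s built-in slur filter works, and the word list and post fields are made up):

        ```python
        # Toy keyword filter for incoming posts; purely illustrative.
        import re

        BLOCKED_PATTERNS = re.compile(r"\b(badword1|badword2)\b", re.IGNORECASE)


        def should_reject(post: dict) -> bool:
            """Return True if the post title or body matches a blocked pattern."""
            text = f"{post.get('title', '')}\n{post.get('body', '')}"
            return bool(BLOCKED_PATTERNS.search(text))


        # Example: should_reject({"title": "totally fine post", "body": "..."}) -> False
        ```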

        • Shide (OP, English) · ↑2 ↓0 · 3 days ago

          Is there a way to choose which instances you want to include, instead of which ones you want to exclude? So that, by default, no instances federate with you except the ones you explicitly allow? Or is this against the spirit of the fediverse?

          • empireOfLove2 (English) · ↑8 ↓0 · 3 days ago

            By default a fresh new instance will federate with no other instances period.
            Instances only “learn” about the existence of an outside instance or community after a user enters a community+instance address in the search bar. After that, the home instance will sync with the remote instance and begin getting all new push data from that point on.

            I don’t know if Lemmy allows “whitelisting” of synced instances, such that it will auto-synchronize with a provided list and ignore all others even if users search for them. I feel like it does, but I’m not familiar enough with the backend to say yea or nay.
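
            For what it’s worth, older lemmy.hjson configs did have a federation allowlist along these lines. The exact key names are from memory (treat them as an assumption and check the docs for your version); newer releases seem to expose the same allow/block lists in the admin settings UI instead:

            ```hjson
            # Older-style lemmy.hjson; key names are from memory, so verify them.
            federation: {
              enabled: true
              # with an allowlist set, only these instances are synced; everything
              # else is ignored even if a user searches for it
              allowed_instances: ["lemmy.ml", "beehaw.org"]
              blocked_instances: []
            }
            ```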

    • 1984 (English) · ↑3 ↓0 · 3 days ago · edited 3 days ago

      I don’t think it’s as bad as you think. If someone uploads illegal content to any service, the admins remove it, of course. Same thing on Lemmy.

      Original images and videos are not spread across instances; they stay on the instance they were uploaded to. However, thumbnails are distributed.

      • Valmond (English) · ↑1 ↓0 · 3 days ago

        Images are definitely uploaded to Lemmy instances.

        • 1984 (English) · ↑4 ↓1 · 3 days ago

          It’s uploaded to one instance, and the other instances link to that one for the picture.

          • drspod (English) · ↑3 ↓0 · 3 days ago

            It’s pretty easy to check and see that this isn’t how it works. I checked both my instance and yours and both of them host the images that have been posted to communities on other instances, so clearly images are transferred (or cached) between instances.
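
            If anyone wants to repeat the check, a quick way is to list recent posts over the instance’s HTTP API and see how many posts from remote communities have thumbnail URLs pointing at your own domain. A rough sketch, assuming Lemmy’s /api/v3/post/list endpoint and the thumbnail_url field (adjust for your version):

            ```python
            # Rough check: how many recent posts from *remote* communities have
            # thumbnails served from this instance's own pict-rs?
            import requests

            INSTANCE = "https://my-instance.example.org"  # replace with your instance


            def count_cached_remote_thumbnails(limit: int = 50) -> None:
                resp = requests.get(
                    f"{INSTANCE}/api/v3/post/list",
                    params={"type_": "All", "sort": "New", "limit": limit},
                    timeout=10,
                )
                posts = resp.json()["posts"]
                cached = [
                    p for p in posts
                    if not p["community"]["local"]
                    and (p["post"].get("thumbnail_url") or "").startswith(INSTANCE)
                ]
                print(f"{len(cached)} of {len(posts)} recent posts are from remote "
                      f"communities but have thumbnails hosted on {INSTANCE}")


            if __name__ == "__main__":
                count_cached_remote_thumbnails()
            ```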

            • Valmond (English) · ↑1 ↓0 · 3 days ago

              Ya, I have a gazillion images in pictrs on my server, and the overwhelming part is absolutely not from my instance.

              Out of curiosity, what’s your instance? Mine is lemmy.mindoki.com :-)

              • 1984 (English) · ↑1 ↓0 · 3 days ago

                Yeah, those are the thumbnails, I think?

                Ok, I guess I’d better go look myself.

                • Valmond (English) · ↑1 ↓0 · 3 days ago

                  Very, very large ones ;-)

                  Some are, but the originals are definitely there too.

    • Shide (OP, English) · ↑2 ↓0 · 3 days ago

      Unfortunately I’m not based in the US.

  • Skull giver (English) · ↑5 ↓0 · 2 days ago

    Legal risks: serious in theory, none in practice. Just don’t leave your instance running unmonitored for months, and make it moderately difficult for bots to sign up. There are tools to detect child porn and other such material, but they require a GPU, which makes hosting a server quite a lot more expensive. Just make sure there’s a way to contact you in case of problematic material and you should be fine. Read up on your local laws about what’s expected of you as a hosting provider if you’re still in doubt; every country has its own requirements.

    As for mental health, it depends on what kind of people you attract. A group of friends visiting normal communities brings barely any mental toll; open signups will have you spending time every week banning and blocking abusive accounts and filtering out borderline porn content.