Hello everyone,

We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting CSAM (Child Sexual Abuse Material) posts all day, but there is nothing we can do, because the offenders just post from another instance now that we have changed our registration policy.

We keep working on a solution; we have a few things in the works, but they won't help us right now.

Thank you for your understanding and apologies to our users, moderators and admins of other instances who had to deal with this.

Edit: @Striker@lemmy.world, the moderator of the affected community, made a post apologizing for what happened. But this could not have been stopped even with 10 moderators, and if it hadn't been his community, it would have been another one. It is clear this could happen on any instance.

But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what’s next very soon.

Edit 2: removed the bit about the moderator tools. That came out a bit harsher than we meant it. It's been a long day, and having to deal with this kind of stuff has made some of us a bit salty, to say the least. Remember, we also had to deal with people posting scat not too long ago, so this isn't the first time we have felt helpless. Anyway, I hope we can announce something more positive soon.

  • krayj@sh.itjust.works · 1 year ago

    How does closing lemmyshitpost do anything to solve the issue? Isn't it a foregone conclusion that the offenders would just start targeting other communities, or was there something unique about lemmyshitpost that made it more susceptible?

    • Cabrio@lemmy.world · 1 year ago

      It stops their instance from hosting CSAM and removes their legal liability for content they don't have the capacity to deal with at this point in time.

      How would you respond to having someone else forcibly load up your PC with child porn over the Internet? Would you take it offline?

      • krayj@sh.itjust.works · 1 year ago

        How would you respond to having someone else forcibly load up your PC with child porn over the Internet? Would you take it offline?

        But that's not what happened. They didn't take the server offline; they banned a community. If some remote person had access to my PC and were loading it up with child porn, I would not expect deleting the folder to fix the problem. So I don't understand what your analogy is trying to accomplish, because it's faulty.

        Also, I think you are mistaking my question for some kind of disapproval. It isn't. If closing a community solves the problem, then I fully support the admin team's actions.

        I’m just questioning whether that really solves the problem or not. It was a community created on Lemmy.world, not some other instance. So if the perpetrators were capable of posting to it, they are capable of posting to any community on lemmy.world. You get that, yeah?

        My question is just a request for clarification. How does shutting down one community stop the perpetrators from posting the same stuff to other communities?

        • Ghostalmedia@lemmy.world · 1 year ago

          The fact of the matter is that these mods are not lawyers, and even if they were not liable, they would not have the means to fight this in court if someone falsely, or legitimately, claimed they were. They're hobbyists with day jobs.

          I also mod a few large communities here, and if I’m ever in that boat, I would also jump. I have other shit to do, and I don’t have the time or energy to fight trolls like that.

          If this was Reddit, I’d let all the paid admins, legal, PR, SysOps, engineers and UX folks figure it out. But this isn’t Reddit. It’s all on the hobbyist mods to figure it out. Many are not going to have the energy to put up with it.

          • krayj@sh.itjust.works · 1 year ago

            How does it limit liability when they could continue posting that content to any/every other community on lemmy.world?

            • Cabrio@lemmy.world · 1 year ago

              But it does remove the immediate issue of CSAM coming from !lemmyshitpost, so lemmy.world isn't hosting that content.

              • Double_A · 1 year ago

                Shitpost is not the only community on World, FFS!

                • stealthnerd@lemmy.world · 1 year ago

                  They’re taking a whack-a-mole approach for sure but it’s either that or shut the whole instance down. I imagine their hope is that either the bad guys give up/lose interest or that it buys them some time.

                  Either way, it shows they are taking action which ultimately should help limit their liability.

    • Whitehat Hacker@lemmy.world · 1 year ago

      They also changed account sign-ups to be application-only, so people can't create accounts without being approved.

    • Ghostalmedia@lemmy.world · 1 year ago

      It doesn't solve the bigger moderation problem, but it solves the immediate issue for the mods who don't want to go to jail for modding a community hosting CSAM.

      • krayj@sh.itjust.works · 1 year ago

        Doesn't that send a clear message to the perpetrators that they can get any community shut down and killed just by posting CSAM to it? What makes you or anyone else think that, upon seeing that lemmyshitpost is gone, the perpetrators will all just quit? Was lemmyshitpost the only community they were able to post in?

        • Ghostalmedia@lemmy.world · 1 year ago

          Yup. The perpetrators win.

          If you were in their shoes, would you want to risk going to jail for kiddy porn, risk having your name associated with CSAM online, or drain your personal savings to fight these folks?

          These mods are not protected by a well funded private legal team. This isn’t Reddit.

          • krayj@sh.itjust.works · 1 year ago

            You don’t have to explain how liability works. I get it. What I don’t get is how removing that specific community is going to limit their liability when the perpetrators will just target a different community.

            • Whitehat Hacker@lemmy.world · 1 year ago

              Sign-ups now require a manual approval application, so there are no more automated sign-ups. If the perpetrators have existing accounts and target another community, it'll be closed as well and those accounts banned. There won't be a stream of new accounts, though, because all accounts going forward need to be manually approved.

            • ttmrichter@lemmy.world · 1 year ago

              One of the ways you avoid liability is you show that you’re actively taking measures to prevent illegal content.

          • MsPenguinette@lemmy.world · 1 year ago

            The perps are taking a big risk as well. Finding and uploading CSAM means being in possession of it, so we can at least take solace in knowing it's not a tool that just anyone will use to take down a community.

            Uploading it to websites counts as distribution. The authorities will actually care about this. It's not just some small thing that is technically a crime; it's big-time crime being used for something petty.

            So while the perps might win in the short term, they are risking their lives using this tactic. I'm not terribly worried about it becoming a common one.

            If anything, if I were the one doing this, I'd be worried that I might be pissing off the wrong group of people. If they keep at it and become a bigger problem, everyone is going to be looking for them. And then that person is going to big-boy prison.

            • krayj@sh.itjust.works · 1 year ago

            That is a great point. I don’t know if the admin team are proactively reporting that activity to law enforcement, but I hope they are.