Several months ago Beehaw received a report about CSAM (i.e. Child Sexual Abuse Material). As an admin, I had to investigate this in order to verify and take the next steps. This was the first time in my life that I had ever seen images such as these. Not to go into great detail, but the images were of a very young child performing sexual acts with an adult.

The explicit nature of these images, the gut-wrenching shock and horror, the disgust and helplessness were very overwhelming to me. Those images are burnt into my mind and I would love to get rid of them but I don’t know how or if it is possible. Maybe time will take them out of my mind.

In my strong opinion, Beehaw must seek a platform where NO ONE will ever have to see these types of images. A software platform that makes it nearly impossible for Beehaw to host, in any way, CSAM.

If the other admins want to give their opinions about this, then I am all ears.

I simply cannot move forward with the Beehaw project unless this is one of our top priorities when choosing where we are going to go.

  • Kangie@lemmy.srcfiles.zip · 1 year ago

    A software platform that makes it nearly impossible for Beehaw to host, in any way, CSAM.

    I hate to say it, but you’ll need to find a text-only platform. Allowing any image uploads opens the door to things like this.

    Besides that, if your concern is that no moderator should ever be exposed to anything like that, a text-only site has problems of its own: you might have to deal with disguised spam links to gore, scams, and the like, and you'll still have to click on links to moderate effectively.

    Maybe you should consider if this is a position that you want to put yourself in again. It sounds like this may just not be for you.

    • Chobbes@beehaw.org · 1 year ago (edited)

      This was my immediate thought as well. It's unfortunate, but there will probably always be people who abuse online platforms like this. It's totally okay if you're not up to the task of moderating disturbing content like that; it sounds like it can be a really brutal job.

      I don't know what the moderation tools on Lemmy are like, but maybe there's a way to route different kinds of moderation concerns to different moderators, so that not everybody has to be exposed to this kind of material if they're not comfortable with it. There could also be a system where, if enough users flag a post, it is automatically marked as NSFW and its images are hidden by default, so moderators and other users aren't exposed to it without warning (though of course such a system could be abused as well). Beyond that, I'm not sure what else you can do, aside from maybe limiting federation.