• NOT_RICK@lemmy.world · 16 days ago

    Under-moderated and under-administered instances have ended up with child porn on them. If that shit gets federated out, it’s a real mess for everyone. Thankfully, I think screening tools are more advanced now; it’s been a while since the last incident.

      • PhobosAnomaly@feddit.uk · 16 days ago

        Make a reliable way to automate that, and you’ll make a lot of money.

        Rely on doing it yourself, and… well, good luck with your mental health in a few years’ time.

        • FaceDeer@fedia.io · 16 days ago

          AI would be able to do a good first pass on it. Except that an AI that was able to reliably recognize child porn would be a useful tool for creating child porn, so maybe don’t advertise that you’ve got one on the job.
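
          For illustration only, here is roughly what such a first pass could look like: a zero-shot CLIP check that quarantines anything scored as explicit for further (hash-based or human) review. The model ID, prompts, and threshold below are assumptions made for the sketch, not a vetted moderation pipeline.

          ```python
          # Rough sketch of an ML "first pass" over uploads: score each image against
          # two text prompts with CLIP and quarantine anything that looks explicit.
          # Model ID, prompts, and threshold are illustrative assumptions only.
          from PIL import Image
          from transformers import CLIPModel, CLIPProcessor

          MODEL_ID = "openai/clip-vit-base-patch32"  # assumed public checkpoint
          model = CLIPModel.from_pretrained(MODEL_ID)
          processor = CLIPProcessor.from_pretrained(MODEL_ID)
          PROMPTS = ["a safe-for-work photograph", "explicit sexual content"]

          def hold_for_review(path: str, threshold: float = 0.5) -> bool:
              """True if the upload should be quarantined pending further checks."""
              image = Image.open(path).convert("RGB")
              inputs = processor(text=PROMPTS, images=image,
                                 return_tensors="pt", padding=True)
              probs = model(**inputs).logits_per_image.softmax(dim=1)[0]
              return float(probs[1]) > threshold
          ```

          A coarse flagger like this only narrows the pile; it says nothing about the reliability or misuse problems being discussed here.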

        • Spiderwort@lemmy.dbzer0.com (OP) · 16 days ago

          So that’s the indispensable service that admins provide: child porn filtering.

          I didn’t realize it was such a large job. So large that it justifies the presence of a cop in every conversation? I dunno.

          • PhobosAnomaly@feddit.uk · 16 days ago

            I’ve read through a few of your replies, and they generally open with a “so, …” followed by an inaccurate summary of what the conversation thread is about. I don’t know whether there’s a language barrier here or you’re being deliberately obtuse.

            It would appear that your desire for a community without moderators is so strong that a platform like Lemmy is not suitable for what you want, and as such you are unlikely to find the answer you want here and will instead spend your time arguing against the flow.

            Good luck finding what you’re looking for 👍

          • Zak@lemmy.world · 16 days ago

            If your questions are concrete and in the context of Lemmy or the Fediverse more broadly, admins provide the service of paying for and operating the servers, in addition to moderation.

            If it’s more abstract, i.e. “can people talk to each other over the internet without moderators?”, then my experience is that they usually can when the group is small, but things deteriorate as it grows larger. The threshold for where that happens is higher if the group has a purpose or if the people already know each other.

      • partial_accumen@lemmy.world · 16 days ago

        Surely filtering out childporn is something that I can do for myself.

        Even if that were a viable solution (it isn’t), humans employed to filter out this disgusting content (and worse) are frequently psychologically damaged by the exposure. This includes people at online content-moderation companies and those in law enforcement who have to deal with that material for evidentiary reasons.

        The reason it’s not a viable solution is that if YOU block it because YOU don’t want to see it, but it’s still there, it becomes a magnet for those who DO want to see it, because they know it’s allowed. The value of the remaining legitimate content goes down as more of your time is spent blocking the objectionable material yourself, until it’s too much for anyone who doesn’t want that stuff and they leave. Then the community dies.

        • Spiderwort@lemmy.dbzer0.com (OP) · 16 days ago

          Personal cp filtering automation and a shared blacklist. That would take care of the problem. No moderator required.
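
          As a rough sketch of what that could mean in practice, assuming a perceptual-hash blocklist (the file format, threshold, and helper names here are invented for illustration):

          ```python
          # Hypothetical sketch of the "shared blacklist" idea: compare a perceptual
          # hash of each incoming image against a list of known-bad hashes.
          from PIL import Image
          import imagehash  # pip install ImageHash

          def load_blocklist(path: str) -> list:
              """Read one hex-encoded perceptual hash per line (format is assumed)."""
              with open(path) as f:
                  return [imagehash.hex_to_hash(line.strip()) for line in f if line.strip()]

          def is_blocked(image_path: str, blocklist, max_distance: int = 6) -> bool:
              """True if the image is within Hamming distance of any known-bad hash."""
              h = imagehash.phash(Image.open(image_path))
              return any(h - bad <= max_distance for bad in blocklist)
          ```

          Perceptual hashes survive re-encoding and resizing but can be defeated by crops and edits, and the list only covers material someone has already found and reported, which is part of what the replies below are getting at.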

          • db0@lemmy.dbzer0.com · 16 days ago

            Personal cp filtering automation and a shared blacklist

            Oh just those, eh?

            Just goes to show how little idea you have how difficult this problem is.

          • xmunk@sh.itjust.works · 16 days ago

            If you could write an automated filter to block CSAM, Apple, Meta, Alphabet, and others would happily shovel billions at you. Blocking CSAM is a constant and highly expensive job… and when they fuck up, it’s a PR shitstorm.

      • Leraje@lemmy.blahaj.zone · 16 days ago

        That means the CSAM (it’s not ‘child porn’, it’s child abuse) remains on the server, which means the instance owner is legally liable. I don’t know about you, but if I were an instance owner I wouldn’t want the shame and legal consequences of leaving CSAM up on a server I control.

      • NeoNachtwaechter@lemmy.world · 16 days ago

        filtering out […] I can do for myself.

        It still means too much legal trouble for the admin if the offending data is on their server.