Surely filtering out child porn is something that I can do for myself.
Even if that were a viable solution (it isn’t), the humans employed to filter out this disgusting content (and worse) are frequently psychologically damaged by the exposure. That includes staff at online content-moderation companies and people in law enforcement who have to handle that material for evidentiary reasons.
The reason it’s not a viable solution: if YOU block it because YOU don’t want to see it, but it’s still there, the place becomes a magnet for those who DO want to see it, because they know it’s allowed. The value of the remaining legitimate content drops because more of your time goes to blocking the objectionable material yourself, until it’s too much for anyone who doesn’t want that stuff and they leave. Then the community dies.
Personal CP-filtering automation and a shared blacklist. That would take care of the problem. No moderator required.
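For concreteness, a naive take on that proposal might look like the sketch below, assuming a shared blocklist distributed as one hex digest per line (the file and directory names here are hypothetical). It also shows why this doesn’t actually take care of the problem: a cryptographic hash only catches exact byte-for-byte copies, while production systems such as Microsoft’s PhotoDNA or Meta’s PDQ use perceptual hashes precisely because a single re-encode, crop, or resize changes a SHA-256 digest completely.

```python
# Minimal sketch of "personal filtering + shared blacklist".
# Assumes a hypothetical shared_blocklist.txt with one hex digest per line.
import hashlib
from pathlib import Path

def load_blocklist(path: str) -> set[str]:
    """Read one hex digest per line into a set for O(1) lookups."""
    return {line.strip().lower()
            for line in Path(path).read_text().splitlines()
            if line.strip()}

def is_blocked(image_bytes: bytes, blocklist: set[str]) -> bool:
    """Exact-match check: defeated by ANY byte-level change to the file."""
    return hashlib.sha256(image_bytes).hexdigest() in blocklist

blocklist = load_blocklist("shared_blocklist.txt")
for upload in Path("incoming").glob("*"):          # hypothetical upload dir
    if is_blocked(upload.read_bytes(), blocklist):
        upload.unlink()                            # reject the known-bad file
```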
It is illegal in most countries to host child sexual abuse material.
Oh just those, eh?
Just goes to show how little idea you have how difficult this problem is.
This is starting to sound like, “we need constant control and surveillance to protect us from the big bad”.
You know, for the children.
Mate, if you don’t like the way we run things, go somewhere else. You’re not forced to be here.
Bro is talking smack to their own instance host 💀
It’s ok, I don’t mind people challenging me.
And you are magnanimous for doing so. If someone came into my house and tried to dictate their rules to me, I’d be fuming.
This is why I’m not a mod 🤡
Well, when one runs an instance with 1000+ MAU (monthly active users), they need to accept it’s not “their house” anymore. Those that don’t quickly end up in shit storms with their communities 😅
If you could write an automated filter that reliably blocks CSAM, Apple, Meta, Alphabet and others would happily shovel billions at you. Blocking CSAM is a constant and highly expensive job… and when they fuck it up, it’s a PR shit storm.