• Geetnerd@lemmy.world · edit-2 · 6 days ago

    Well, some pedophiles have argued that AI-generated child porn should be allowed, since no real humans are harmed or exploited.

    I’m conflicted on that. Naturally, I’m disgusted and repulsed. I AM NOT ADVOCATING IT.

    But if no real child is harmed…

    I don’t want to think about it, anymore.

    • ZILtoid1991@lemmy.world · 6 days ago

      The issue is, AI is often trained on images of real children, sometimes even on real CSAM (allegedly), which makes the “no real children were harmed” part not necessarily 100% true.

      Also, since AI can generate photorealistic imagery, it muddies the waters for identifying the real thing.

    • Ledericas@lemm.ee · 6 days ago

      That is still CP, and distributing CP still harms children. Eventually they want to move on to the real thing, as porn no longer satisfies them.

      • Schadrach@lemmy.sdf.org · 5 days ago

        eventually they want to move on to the real thing, as porn is not satisfying them anymore.

        Isn’t this basically the same argument as claiming that violent media creates killers?

    • quack@lemmy.zip · 5 days ago

      I understand you’re not advocating for it, but I do take issue with the idea that AI CSAM will prevent children from being harmed. While it might satisfy some of them (at first, until the high from that wears off and they need progressively harder stuff), a lot of pedophiles are just straight-up sadistic fucks, and a real child being hurt is what gets them off. I think it’ll just make the “real” stuff even more valuable in their eyes.

      • 🎨 Elaine Cortez 🇨🇦 @lemm.ee · 5 days ago

        I feel the same way. I’ve seen the argument that it’s analogous to violence in video games, but that’s pretty disingenuous, since people typically play video games for fun and escapism, whereas someone seeking out CSAM is doing so in bad faith. A more apt comparison would be people who go out of their way to hurt animals.

        • Schadrach@lemmy.sdf.org · edit-2 · 4 hours ago

          A more apt comparison would be people who go out of their way to hurt animals.

          Is it? That person is going out of their way to do actual violence. Arguing that watching a slasher movie makes someone more likely to go commit murder feels like a much closer analogy to arguing that watching a cartoon of a child engaged in sexual activity (or whatever) makes someone more likely to molest a real kid.

          We could make it a video game about molesting kids, with Postal or Hatred as our points of comparison, if that would help. I’m sure someone somewhere has made such a game, and I’m absolutely sure you’d consider COD “fun and escapism” while someone playing that sort of game is doing so “in bad faith”, despite both being simulations of something that is definitely illegal, and despite the core of the argument being that one makes the person want to do the illegal thing more and the other does not.

    • misteloct@lemmy.world · 6 days ago

      Somehow I doubt allowing it would meaningfully help the situation. It sounds like an alcoholic arguing that a glass of wine actually helps them not drink.