• horncorn@lemmynsfw.com · 10 months ago

    Article title is a bit misleading. Just glancing through, I see he texted at least one minor about this and distributed the generated pics in a few places. Putting it all together, yeah, the arrest is kind of a no-brainer. Ethically, generating CSAM is pretty much the same as drawing it. Not much we can do about it aside from education.

    • ricecake@sh.itjust.works · 9 months ago

      Legally, a sufficiently detailed image depicting CSAM is CSAM, regardless of how it was produced. Sharing it is inevitably how he got caught, but it would still be illegal even if he had never brought a minor into it.

    • retrospectology@lemmy.world · 10 months ago

      Lemmy really needs to stop justifying CP. We can absolutely do more than “eDuCaTiOn”. AI is created by humans and the training data is gathered by humans; it needs regulation like any other industry.

      It’s absolutely insane to me how laissez-faire some people are about AI. It’s like a cult.

      • Autonomous User@lemmy.world · 9 months ago

        This is one of the two classic excuses: virtue signalling used to hijack control of our devices and our computing, an attack on libre software (they don’t care about CP). Next they’ll be banning more math, like encryption, again.

        It says gullible at the start of this page; scroll up and see.

      • DarkThoughts@fedia.io · 9 months ago

        You don’t need CSAM training data to create CSAM images. If your model knows what children look like and what naked human bodies look like, then it can create naked children. That’s simply how generative models like this work, and it has absolutely nothing to do with models specifically trained for CSAM using actual CSAM material.

        So while I disagree with him that a lack of education is the cause of CSAM or pedophilia, I’d say education could help with the general hysteria about LLMs, like the hysteria coming from you, people who just let their emotions run wild when these topics come up. You need to understand that the goal should be the protection of potential victims, not the punishment of victimless thought crimes.