The police investigation remains open. The photo of one of the minors included a fly, the logo of Clothoff, the application presumably being used to create the images; the app promotes its services with the slogan: “Undress anybody with our free service!”

  • fiah@discuss.tchncs.de
    1 year ago

    okay, let’s rethread how we got here:
    OP: Spanish girls report AI porn of them circulating
    parent comment: Blockchain could fix this

    you’re missing a step there, buddy. I know, it’s hard, let me make it a bit easier for you by drawing a picture:

    “blockchain can fix this” was never about preventing AI porn from being spread; it’s about the general problem of knowing whether something is authentic, hence their choice to reply to that comment with that article

    • nudny ekscentryk
      1 year ago

      Again, for the sixth or whichever time: this has nothing to do with the crux of the problem

      • papertowels@lemmy.one
        1 year ago

        …you’re right, it has nothing to do with nudes because it’s talking about an entirely different problem of court-admissible evidence.

      • fiah@discuss.tchncs.de
        1 year ago

        yes, you’re right, it doesn’t, because we weren’t talking about that. “blockchain” can’t do anything to keep AI-generated naked pictures of kids from being spread, and nobody here claimed otherwise
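
To make the “knowing whether something is authentic” point above concrete, here is a minimal sketch, assuming a publisher registers the SHA-256 hash of an image when it is released and that record is anchored somewhere tamper-evident. The `REGISTERED_HASHES` dict below is a hypothetical stand-in for such an on-chain registry, and all names and values are illustrative. A viewer can later re-hash a file and check whether it matches a registered original; as the thread notes, this verifies provenance but does nothing to stop fakes from being generated or spread.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for an on-chain registry: in a real system the hash
# and metadata would be anchored in a blockchain transaction at publication
# time, so the record itself can't be quietly altered later.
REGISTERED_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b": {
        "publisher": "example.org",
        "published": "2023-09-01",
    },
}

def sha256_of_file(path: Path) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def check_authenticity(path: Path) -> str:
    """Report whether this exact file matches a registered original."""
    record = REGISTERED_HASHES.get(sha256_of_file(path))
    if record is None:
        return "no registered original found: unverified or modified"
    return f"matches original published by {record['publisher']} on {record['published']}"

if __name__ == "__main__":
    print(check_authenticity(Path("some_image.jpg")))
```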