The police investigation remains open. The photo of one of the minors included a fly: the logo of Clothoff, the application presumably being used to create the images, which promotes its services with the slogan: “Undress anybody with our free service!”

    • nudny ekscentryk
      −1 · 10 months ago

      it very much is:

      OP: In Spain, dozens of girls are reporting AI-generated nude photos of them being circulated at school: ‘My heart skipped a beat’

      parent reply: That’s why we need Blockchain Technology

      • @fiah@discuss.tchncs.de
        3 · edited · 10 months ago

        a discussion can have multiple, separate threads with branching topics; that’s what this threaded comment system is specifically made to facilitate

        • nudny ekscentryk
          0 · 10 months ago

          okay, let’s rethread how we got here:

          OP: Spanish girls report AI porn of them circulating

          parent comment: Blockchain could fix this

          1st-level reply: Blockchain can’t counteract fake porn being created

          2nd-level reply: it lets you verify the original source

          3rd-level reply: if anything, it lets you verify integrity between sources

          you: if a central authority can’t be trusted to verify sources then Blockchain can

          me: it’s not about verifying provenance of the material but rather its mere existence in the world

          you: we can store the fingerprint of the file in a trusted database

          me: but this doesn’t affect the material’s existence

          you: you’re going off-topic!

          me: I am not

          you: this conversation can have multiple threads

          can you now see how it’s you who’s off the rails in this conversation? no one ever questioned that blockchain could allow verifying any piece of media’s authenticity, but spreading forged, nonconsensual erotica is NOT about proving whether the photo or video in question is authentic; the problem is that people have the tools to create it in the first place, and by the time a victim can counteract and prove (using blockchain, if you will) that a particular photo is a forgery, the damage is already done
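An aside on the mechanics both sides of the thread agree on: the “store the fingerprint of the file in a trusted database” step can be sketched in a few lines. This is a hypothetical illustration (the registry, function names, and byte strings are invented, and a set stands in for any trusted database or ledger); it also demonstrates exactly the limitation argued above, since a hash check can only flag that a file was never registered, not stop a forgery from existing or spreading.

```python
import hashlib

# Stand-in for a trusted database (or a blockchain ledger) of known originals.
registry = set()

def fingerprint(data: bytes) -> str:
    """Return the SHA-256 hex digest of the raw file bytes."""
    return hashlib.sha256(data).hexdigest()

def register(data: bytes) -> None:
    """Record the fingerprint of an original file."""
    registry.add(fingerprint(data))

def is_registered_original(data: bytes) -> bool:
    """True only if these exact bytes were registered earlier."""
    return fingerprint(data) in registry

original = b"original photo bytes"
forgery = b"AI-generated forgery bytes"

register(original)
print(is_registered_original(original))  # True: matches the stored fingerprint
print(is_registered_original(forgery))   # False: the forgery was never registered
```

Note the asymmetry: a negative result proves nothing about where the forgery came from or who made it; it only shows the bytes differ from a registered original, which is the “verify integrity between sources” point from the recap.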

          • @fiah@discuss.tchncs.de
            0 · 10 months ago

            okay, let’s rethread how we got here:
            OP: Spanish girls report AI porn of them circulating
            parent comment: Blockchain could fix this

            you’re missing a step there, buddy. I know, it’s hard, let me make it a bit easier for you by drawing a picture:

            “blockchain can fix this” was never about preventing AI porn from being spread; it’s about the general problem of knowing whether something is authentic, hence their choice to reply to that comment with that article

            • nudny ekscentryk
              1 · 10 months ago

              Again, for the sixth or whichever time: this has nothing to do with the crux of the problem

              • @papertowels@lemmy.one
                2 · 10 months ago

                …you’re right, it has nothing to do with nudes, because it’s talking about an entirely different problem of court-admissible evidence.

              • @fiah@discuss.tchncs.de
                1 · 10 months ago

                yes, you’re right, it doesn’t, because we weren’t talking about that. “blockchain” can’t do anything to keep kids from having AI-generated naked pictures of them spread around, and nobody here claimed otherwise