Robin Williams’ daughter Zelda says AI recreations of her dad are ‘personally disturbing’: ‘The worst bits of everything this industry is’

  • _number8_@lemmy.world · 1 year ago

    Imaginary scenario:

    You love Good Will Hunting, you’re going through a tough time, and you use AI to have Robin Williams say something gentle and therapist-y that directly applies to you and your situation – is this wrong?

    • Naz@sh.itjust.works · 1 year ago

      I’ve asked an extremely high-end AI questions on ethics of this nature, and after thinking for exactly 14.7 seconds it responded with:

      • The ethics of generating images, sound, or other representations of real people is considered no different than active imagination when done for fun and in privacy.

      • However, spreading those images to others without the original person’s consent is considered a form of invasion of privacy and impersonation, and is therefore unethical.

      Basically, you’re fine with imagining Robin Williams talking to you, but if you record that and share it with others/disseminate the content, then it becomes unethical.

      • TwilightVulpine@lemmy.world · 1 year ago

        • The ethics of generating images, sound, or other representations of real people is considered no different than active imagination when done for fun and in privacy.

        That doesn’t sound right at all. Copying and processing somebody’s works for the sake of creating a replica is completely different from imagining it yourself. Depending on how it’s done, even the claim that it’s being done solely for yourself may not hold: many AI-based services take feedback from what their users do, even if the users don’t actively share it.

        Just as looking at something, memorizing it, and imitating it is allowed while taking a picture of it may not be, an AI would not necessarily get the same rights to engage with media that people have. It’s not an independent actor with personal rights. It’s not an extension of the user. It’s a tool.

        Then again, I shouldn’t be surprised that an AI used and trained by AI users describes its own use as basically a natural right.

        • JackbyDev@programming.dev · 1 year ago

          Please see the second point. Essentially, you cannot commit a copyright violation if you don’t distribute anything. Same concept.

          • TwilightVulpine@lemmy.world · 1 year ago

            These AIs are not being produced by the rights owners, so it seems unlikely that they are being built without unauthorized distribution somewhere in the process.

            • JackbyDev@programming.dev · 1 year ago

              I get your point, but for the purpose of the thought exercise I think it’s better to assume you built the model yourself, to get at the crux of “I am interested in making an image of a dead celebrity saying nice things to me,” especially since the ethics of building and sharing models of copyrighted content is a totally different question with its own can of worms.

    • Empricorn@feddit.nl · 1 year ago

      I wouldn’t frame it as a question of morality, but I bet it isn’t healthy. I would urge this theoretical person to consult an actual licensed therapist.