‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity

It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • deft@ttrpg.network · +13/−2 · 1 year ago

    Fully agree, but I do think that’s more an issue with the psychology and trauma in our world. Children being nude should not be a big deal — they’re kids, you know?

    • Eezyville@sh.itjust.works · +7 · 1 year ago

      It shouldn’t be a big deal if they choose to be nude somewhere that is private for them and where they’re comfortable. The people who are using this app to make someone nude aren’t really asking for consent. And that brings up another issue: consent. If you have images of yourself posted publicly, is consent needed to alter those images? I don’t know, but I don’t think so, since they’re out in the public domain.