A very NSFW website called Pornpen.ai is churning out an endless stream of graphic, AI-generated porn. We have mixed feelings.

  • funkless_eck@sh.itjust.works · 1 year ago

    “eh I’ll take a look”

    first thing I see is a woman on her back with her legs behind her head, smooth skin where her genitals should be and nipples in the middle of her buttocks.

    “alright then”

  • Fredselfish@lemmy.world · 1 year ago

    Could be interesting. I mean, for one thing, no real person is being exploited, and those with strange fetishes can be satisfied. But it could be very disturbing as well. Wonder how long until AI video porn?

      • Magrath@lemmy.ca · 1 year ago

        Are you talking about deep fakes? None of that is original. They just take a face and AI maps it onto a pornstar’s body in an existing video.

        • JackbyDev@programming.dev · 1 year ago

          I feel like we will start to see that more often. A complaint I see a lot here is that some people want extremely niche content that is very specific. If you have some people who get into those very specific poses and do those very specific things, then run it through SD’s image-to-image feature, you can make the person look sexy but keep the same pose. The same way we have folks do mo-cap for roles, we might start seeing people specifically hired to do this sort of stand-in work.

    • guyrocket@kbin.social · 1 year ago

      I think it is just a matter of time for the disturbing stuff to circulate. If people can create their darkest desires they will.

      Then cue debates and political campaigns about AI in general and whether we should allow anyone or anything to create depraved images, pornographic or not.

      • gullible@kbin.social · 1 year ago

        That’s so good, sissy. You got even better after I amputated your legs.

        “We don’t intend to police the use of developing technologies at this time.”

        That’s so good, sissy, blind that billionaire with your acidic piss.

        “We cannot allow our children to be exposed to such grotesque videos.”

      • Ryantific_theory@lemmy.world · 1 year ago

        Yeah, although I think part of the missing nuance is that people already did that, the difference being that now anyone can, in theory, create what’s inside their head, regardless of their actual artistic talent. Now that creation is accessible though, everyone’s having another moral panic over what should be acceptable for people to create.

        If anything, moving the more disturbing stuff from the real world to the digital seems like an absolute win. But I suppose there will always be the argument, much like video games making people violent, that digital content will become real.

      • shortly2139@lemmy.world · 1 year ago

        People have been able to draw for eons. Imagine trying to ban the public from drawing because the odd few people are a little mixed up.

        AI is just a tool, like the pencil, charcoal or paints.

        I know you aren’t suggesting we ban it from the public; that’s just my side of the hypothetical debate that you’re right in saying will arrive.

  • Touching_Grass@lemmy.world · 1 year ago

    I’m not sure if I have the strength to clutch these pearls any harder over these articles. I peaked 8 articles ago.

    • threeduck@aussie.zone · 1 year ago

      Weakling.

      I’ve been clutching so hard, the pearls fused into my flesh years ago. I’ve bankrupted myself buying more pearls, inserted one by one into my clenched fist.

      Luckily the mere sight of me - a lurching pearlescent beast with glinting pearls for eyes - causes clams to voluntarily offer their own in reverence, my own unending supply.

  • sramder@lemmy.world · 1 year ago

    People who insist on real flesh porn will ultimately be viewed as weirdos out of touch with reality, like people who insist everything sounds better on vinyl.

    Fast forward 25 years past the first AI war, and a ragged but triumphant humanity must rediscover the lost art of waxing.

    • Harpsist@lemmy.world · 1 year ago

      Why would I want to encourage the flesh trade where real women are hurt? And are limited to what humans are physically capable of?

      When I can have AI generated people who are able to do anything imaginable and no one gets hurt?

      There’ll be arguments that ‘once people get used to the fantasies they’ll want to try it in real life’, but we all know that just isn’t true from 40 years of video games. There hasn’t been any uptick in people eating mushrooms and jumping on turtles or - whatever the fuck a goomba is.

  • randon31415@lemmy.world · 1 year ago

    When I first heard Stable Diffusion was going open source, I knew this would happen. The only thing I’m surprised at is that it took almost 2 years.

    • lloram239@feddit.de · 1 year ago

      It went quite a bit faster than that. StableDiffusion has only been out for about 13 months and this started about three months after that with Unstable Diffusion. What this article is reporting on is already quite a few months old and quite a bit behind what you can do with a local install of StableDiffusion/Automatic1111/ControlNet/etc. (see CivitAI).

  • Sume@reddthat.com · 1 year ago

    Not sure how people can be so into this shit. It’s all so generic looking.

    • BreakDecks@lemmy.ml · 1 year ago

      The actual scary use case for AI porn is that if you can get 50 or more photos of the same person’s face (almost anyone with an Instagram account), you can train your own LoRA model to generate believable images of them, which means you can now make “generic looking” porn with pretty much any person you want to see in it. Basically the modern equivalent of gluing cutouts of your crush’s face onto the Playboy centerfold, only with automated distribution over the Internet…

      • lloram239@feddit.de · 1 year ago

        Using a LoRA was the old way; these days you can use Roop, FaceSwapLab or ReActor, which not only can work with as little as a single good photo, they also produce better looking results than a LoRA. There is no time-consuming training either; just drag & drop an image and you get results in a couple of seconds.

        • pinkdrunkenelephants@sopuli.xyz · 1 year ago

          So how will any progressive politician be able to be elected then? Because all the fascists would have to do is generate porn with their opponent’s likeness to smear them.

          Or even worse, deepfake evidence of rape.

          Or even worse than that, generate CSAM with their likeness portrayed abusing a child.

          They could use that to imprison not only their political opponents, but anyone for anything, and people would think whoever is being disappeared this week actually is a pedophile or a rapist and think nothing of it.

          Actual victims’ movements would be chopped off at the knee, because now there’s no definitive way to prove an actual rape happened since defendants could credibly claim real videos are just AI generated crap and get acquitted. No rape or abuse claims would ever be believed because there is now no way to establish objective truth.

          This would leave the fascists open to do whatever they want to anybody with no serious consequences.

          But no one cares because they want AI to do their homework for them so they don’t have to think, write, or learn to be creative on their own. They want to sit around on their asses and do nothing.

          • hyperhopper@lemmy.ml · 1 year ago

            People will have to learn to stop believing everything they see. This has been possible with Photoshop for even more than a decade now. All that’s changed is that it takes less skill and time now.

            • pinkdrunkenelephants@sopuli.xyz · 1 year ago

              That’s not possible when AI-generated images are impossible to distinguish from reality, or even from expertly done photoshops. The practice, and generative AI as a whole, needs to be banned. They’re putting AI in Photoshop too, so ban that garbage as well.

              It has to stop. We can’t allow the tech industry to enable fascism and propaganda.

                • pinkdrunkenelephants@sopuli.xyz · 1 year ago

                  Nah, that Thanos I-am-inevitable shit doesn’t work on me. They can ban AI; you all just don’t want that, because generative AI allows you to steal other people’s talent so you can pretend you have your own.

              • CoolCat38@lemmy.world · 1 year ago

                Can’t tell whether this is bait or if you are seriously that much of a Luddite.

                • pinkdrunkenelephants@sopuli.xyz · 1 year ago

                  Oh look at that, they just released pictures of you raping a 4-year-old, off to prison with you. Never mind they’re not real. That’s the world you wanted and those are the consequences you’re going to get if you don’t stop being lazy and learn to reject terrible things on ethical grounds.

          • Silinde@lemmy.world · 1 year ago

            Because that’s called libel and is very much illegal in practically any country on earth - and depending on the country it’s either easy or trivial to bring and win a libel case in court, since the onus is on the defendant to prove that what they said was entirely true, and “just trust me and this actress I hired, bro” doesn’t cut it.

              • Silinde@lemmy.world · 1 year ago

                The burden of liability will then fall on the media company, which can then be sued for not carrying out due diligence in reporting.

          • Liz@midwest.social · 1 year ago

            We’re going to go back to the old model of trust, before videos and photos existed. Consistent, coherent stories from sources known to be trustworthy will be key. Physical evidence will be helpful as well.

            • pinkdrunkenelephants@sopuli.xyz · 1 year ago

              But then people will say “Well how do we know they’re not lying?” and then it’s back to square 1.

              Victims might not ever be able to get justice again if this garbage is allowed to continue. Society’s going so off-track.

          • IHaveTwoCows@lemm.ee · 1 year ago

            And they will respond to this fascist abuse by telling everyone to vote harder and donate money

    • Psythik@lemm.ee · 1 year ago

      AI is still a brand-new tech. It’s like getting mad at AM radio for being staticky and low quality. It’ll improve with time as we get better tech.

      Personally I can’t wait to see what the future holds for AI porn. I’m imagining being able to get exactly what you want with a single prompt, and it looks just as real as reality. No more opening 50 tabs until you find the perfect video. Sign me the fuck up.

  • cley_faye@lemmy.world · 1 year ago

    “Are we ready”, in the sense that for now it’s 95% garbage and 5% completely generic but passable looking stuff? Eh.

    But as the quality increases, the answer would be… who cares. It would suffer from the same major issues as other large models: sourcing data, and how we decide the rights of the output. As for it being porn… maybe there’s no point in focusing on that specific issue.

  • themeatbridge@lemmy.world · 1 year ago

    Does it say something about society that our automatons are better at creating simulated genitals than they are at hands?

    • douglasg14b@lemmy.world · 1 year ago

      It says that we are biologically predisposed to sex, which we are, like animals, which we are.

      It doesn’t say anything about society, it just confirms the human condition.

    • lloram239@feddit.de · 1 year ago

      They suck quite a lot at genitals too. But what makes hands especially tricky is simply that they are pretty damn complex. A hand has five fingers that can all move independently, the hand can rotate in all kinds of way and the individual parts of a hand can all occlude each other. There is a lot of stuff you have to get right to produce a good looking hand and it is especially difficult when you are just a simple 2D algorithm that has little idea of 3D structure or motion.

    • Bop@lemmy.film · 1 year ago

      On a visual level, we are more interested in genitals than hands? Also, faces.

  • RBWells@lemmy.world · 1 year ago

    Meh. It’s all only women and so samey-samey. Not sexy IMO, but I don’t think fake is necessarily not hot; art certainly can be.

    • Zerfallen@lemmy.world · 1 year ago

      You can change it to men, but most of the results are intersex(?) or outright women anyway. I guess the training data is heavily weighted toward examples of women.

  • joelfromaus@aussie.zone · 1 year ago

    Went and had a look and it’s some of the funniest stuff I’ve seen all day! A few images come close to realism, but a lot of them are the sort of AI fever-dream stuff that you could not make up.