Police investigation remains open. The photo of one of the minors included a fly; that is the logo of Clothoff, the application that is presumably being used to create the images, which promotes its services with the slogan: “Undress anybody with our free service!”
This just isn’t true. They will still be used to sexualise people, mostly girls and women, against their consent. It’s no different from AI-generated child pornography. It does harm even if no ‘real’ people appear in the images.
Fucking horrible world we’re forced to live in. Where’s the fucking exit?
It is different from AI-generated CSAM, because real people are actually being harmed by these deepfake images.
I was replying to someone who was claiming they aren’t harmful as long as everyone knows they’re fake. Maybe nitpick them, not me?
Real kids are harmed by AI CSAM normalising a problem they should be seeking help for, not getting off on.
deleted by creator
Not getting beyond your first sentence here. I am not interested in what fucked up laws have been passed. Nor in engaging with someone who wants to argue that any form of child porn is somehow OK.
deleted by creator
deleted by creator
I'm addressing you because you made the claim they are equivalent when they clearly are not.
No I didn’t. Go nitpick someone else.
Or better still, explain why you think AI-generated CSAM isn’t harmful. FFS
Let’s be real here:
Sure, it’s not illegal. But if I find “those kinds” of AI-generated images on someone’s phone or computer, the fact that it’s AI-generated will not improve my view of that person in any possible way.
Even if it’s technically “legal”.
They tellin’ on themselves.
People who consume any kind of CP are dangerous, and encouraging that behavior is just as criminal. I'm glad that shit is illegal in most civilized countries.