It seems crazy to me, but I've seen this concept floated on several different posts. There seem to be a number of users here who think there is some way AI-generated CSAM will reduce real-life child victims.
Like the comments on this post here.
https://sh.itjust.works/post/6220815
I find this argument crazy. I don’t even know where to begin to talk about how many ways this will go wrong.
My views (which are apparently not based in fact) are that AI CSAM is not really that different from "actual" CSAM. It will still cause harm when viewed. And it is still based on the further victimization of the children involved.
Further, the (ridiculous) idea that making it legal will somehow reduce the number of predators by giving them an outlet that doesn't involve real, living victims completely ignores the reality of how AI content is created.
Some have compared pedophilia and child sexual assault to a drug addiction, which is dubious at best. And pretty offensive imo.
Using drugs has no inherent victim. And it is not predatory.
I could go on, but I'm not an expert or a social worker of any kind.
Can anyone link me articles talking about this?
How do you not see how fucking offensive this is? A drawing is "not really different" from a REAL LIFE KID being abused?
The same way killing someone in a video game will cause harm?
The made up children? What the hell are you talking about?
No one sane is saying actually abusing kids is like a drug addiction. But you're conflating pedophilia and assault. When it's said that pedophilia is like a drug addiction, it's non-offending pedophiles who are being discussed. Literally no one thinks assaulting kids is like a drug addiction. That's your own misunderstanding.
About what exactly? There’s 0 evidence that drawings or fantasies cause people to assault children.
How are the drawings made?
Generated by a computer by looking at some normal porn as well as (non-sexual) pictures of children and trying to combine those?
Photoshopping a broad average of a child's face onto a broad average of a naked person, on top of the broad average of a child's body, in the broad average of a pornographic position.
Image generating AI can “learn” and apply concepts on top of each other, sometimes leading to otherworldly weirdness, sometimes resulting in realistic pictures of muppets being convicted during the Nuremberg trials.
I don't get it. It seems many people want to condemn all forms of child porn, seemingly to avoid downvotes, because for some reason the internet community can't see that AI-generated images don't harm anyone.
Because it doesn't happen if there isn't evidence?
._.
Yeah. People are way too hung up on there being evidence of stuff. It just FEELS right to you, right?
How shocking. The furry defending pedophilia and cartoons of people fucking kids.
Lol ok. No actual response to legitimate points that were not "defending" pedophiles (and I will 100% defend all cartoons). Just "lol furry pedophile". Typical.