It seems crazy to me, but I've seen this concept floated on several different posts. There seem to be a number of users here who think there is some way AI-generated CSAM will reduce real-life child victims.
Like the comments on this post here.
https://sh.itjust.works/post/6220815
I find this argument crazy. I don't even know where to begin describing how many ways this will go wrong.
My views (which are apparently not based in fact) are that AI CSAM is not really that different from "actual" CSAM. It will still cause harm when viewed, and it is still based in the further victimization of the children involved.
Further, the (ridiculous) idea that making it legal will somehow reduce the number of predators by giving them an outlet that doesn't involve real living victims completely ignores the reality of how AI content is created.
Some have compared pedophilia and child sexual assault to drug addiction, which is dubious at best, and pretty offensive imo.
Using drugs has no inherent victim. And it is not predatory.
I could go on, but I'm not an expert or a social worker of any kind.
Can anyone link me articles talking about this?
You can try to look for research, but most scientific research is either ethicists debating the issue, or scientists analysing the impact of child abuse and the effectiveness of harsh punishments. There is also plenty of research on the cycle of abuse (many paedophiles were abused as children themselves) and the general impact of abuse on children and adults.
There are no good papers about real world effects of virtual CSAM consumption. You can’t just give one population of paedophiles a load of child porn and check if they rape fewer or more kids. At best you can research the people that do get caught, usually because they’re raping kids or at least are trying to.
A lot of research is done by action groups that may as well be called “kill all paedos”. Obviously their goals are laudable, but they’re not exactly independent researchers. Their goal isn’t “we need to understand what’s driving these people” but “we need to stop these people”. It’s like asking the Catholic Church to research homosexuality, you’re not going to get useful scientific information out of it.
You’re also not going to get decent research on paedos through most public institutions. Imagine being a research subject and walking up to the receptionist like “hello, I’m a volunteer for your paedophilia research”. If you weren’t on a watch list before, you definitely would be now. I don’t think we’ll ever get the answers we need.
If you think AI models are trained out of preexisting CSAM, you don’t seem to get why modern AI models are so revolutionary. The whole point of stable diffusion and later generations is that you don’t need the thing you’re trying to generate to have an equivalent in the dataset.
You can combine child+nude the same way you can combine hotdog+wings. You don’t need pictures of winged hotdogs to generate them out of AI models.
I’m sure there are paedophiles that will use their existing collection to train the models further, but you need quite the GPU power and technical know-how for that. This isn’t an app you can just drag pictures into, it’s a process that takes a lot of time.
I don’t see the point of generating those images anyway, modern image models are complex enough that you don’t need the extra training.
That's a pretty stupid take. Pedophilia can have different causes, but they all stem from either mental illness or the same mechanisms that make people gay. You don't get a free hit of child porn from your friends, and the doctors won't prescribe you child rape when you're having medical issues either.
You can’t just stop being attracted to kids, just like you can’t just decide you’re into men/women now. The difference is that bi/homosexuality isn’t a problem whereas paedophilia is.
There are also paedos who do it for some kind of power dynamic, they don’t care who they abuse, as long as they’re weak and defenceless. Those people are sick in the head and need treatment, or they need to be removed from society. Either way, their intentions aren’t child specific.
Neither does jerking it to computer generated pictures. No animals are harmed when I generate a picture of Mickey Mouse butchering a pig.
Besides, the drug world is full of violence, even in the legal circuit. Drugs are how gangs and regimes all around the world make money. If you consider CSAM consumption to be indirect support for child abuse, you should definitely consider drug consumption to be indirect support for gang violence.
Lastly, there's a very troubling thing I've noticed the majority isn't willing to talk about: there are so, so many people out there who are attracted to kids. Not prepubescent kids, but very few 14-to-16-year-old girls will have escaped men approaching them with sexual comments. The United States of America voted against making child marriage illegal. The amount of "I'll just fuck this behaviour out of her" you can find online about Greta Thunberg from even before she was an adult is disturbing; people with full names and profile pictures on Facebook will sexualise and make rape threats against a child because she said something they didn't like. There's a certain amount of paedophilia that just gets overlooked and ignored.
Even worse, those people aren’t included in research into paedophilia because of how “tolerated” it is. The ones that get caught and researched are the sickos who abuse tens or hundreds of children, but the people who will marry a child won’t be.
Bottom line: this isn't something you can just Google to find an answer; the issue is just too recent. I can take an off-the-shelf image generation model and generate CSAM even when the training set contains no such material. No children will be harmed, yet the resulting imagery is obviously illegal. Abuse-free CSAM is going to be a massive headache for governments and lawyers in the coming years.
Very good comment all around, I just have a nitpick to this section:
This is actually called hebephilia/ephebophilia, which the general public treats very similarly and often subsumes under the term pedophilia. It is considered its own thing, though. To quote Wikipedia:
My guess for why it is more tolerated than straight-up pedophilia is that the targets have reached a more mature body, one that shows some or most properties of a sexually developed person. So while it's still gross and very likely detrimental to the child if pursued (depending on the age in question; 16-18 is pretty close to adulthood), there seems to be more of an understanding for it.
[This comment has been deleted by an automated system]
Being attracted to a pre-puberty or early-puberty child is not only considered wrong because they can’t consent, it’s also considered abnormal because they do not share any features of what a “normal” person would be attracted to, namely developed physical sexual traits. I don’t think there is anything being muddied here.
The physical attraction part gets muddier the more puberty progresses. There isn't really an age limit for this, as puberty works differently for everyone. The psychological/consent part gets muddier as the age progresses, combined with the changes puberty makes to your personality, but it also depends on a ton of other factors, like the kind of upbringing in terms of sex ed. There is a reason that the age of consent differs vastly even between US states, and even more so internationally, even if you only include Western Europe.
So this might be your opinion, but many other people would say otherwise; it's not a hard fact. Especially if you go up to 16, where we allow people of this age to do all sorts of things: in the USA you can drive a car, in Germany you can buy and consume alcohol, and people that age are sometimes already in an apprenticeship to get into a job. People generally start becoming people and stop being kids somewhere in that range.
So while bringing this distinction up muddies the water, it muddies the water only so far as it is already muddy, and this needs to be part of the conversation if the conversation is to have any relation to reality.
So in conclusion, I don't fully agree here. It's not the same; one is way worse than the other. That doesn't make it OK to get what you want through abuse from a 16-year-old, or wherever you want to set the age limit. Or from anyone, for that matter, but younger people need to be better protected, because typically they are easier to abuse. Where exactly that age limit lies is somewhat a matter of opinion, as the different laws show.