• barsoap@lemm.ee · 1 year ago

    Knowing what a nude adult looks like doesn’t mean the model knows what a nude child looks like. I’m quite sure it’s easy to generate disturbing images like that, but I suspect actual paedophiles won’t be satisfied with child faces on small adult bodies.

    Ordinary deepfakes actually have a very similar problem: sure, you can take a picture of a celebrity and tell the AI to undress them, but it won’t be their actual body. The AI can approximate their overall build, but the result is a generic adult body, not the celebrity’s. Or, put differently, AI models aren’t any better at undressing people with their eyes than teenagers are.

    • Amju Wolf@pawb.social · 1 year ago

      I see where you’re coming from but that’s a technical issue that will probably be solved in time.

      It’s also really not black and white; sure, maybe you can see it isn’t perfect, but you’d still prefer content where you know no one was actually harmed.

      Despite the reputation people like that have (a simple artifact of how reporting works), most are as harmless as you and me: they don’t actually want to see innocent people suffer and would never act on their desires. So having a safe and harmless outlet might help.

      • barsoap@lemm.ee · 1 year ago · edited

        > I see where you’re coming from but that’s a technical issue that will probably be solved in time.

        You cannot create information from nothing.

        > So having a safe and harmless outlet might help.

        Psychologists and psychiatrists are still on the fence on that one; I wouldn’t be surprised if it depends on the person. And yes, the external harm produced by AI images is definitely lower than that produced by actual CSAM, doubly so newly produced CSAM, but that doesn’t mean that therapy, even in its current early stages, couldn’t do even better.

        Put differently: we may again be falling into the trap of trying to find technological solutions to societal problems (well, this is /c/technology…). Which isn’t to say we shouldn’t care at all about models trained on CSAM, but that’s addressing symptoms, not causes. Ultimately, addressing root causes matters more: the vast majority of paedophiles are not exclusive paedophiles; often they’re not even really attracted to kids at all beyond having developed a fetish. They’re rapists focussing on the most vulnerable, often because they were victims of sexual abuse themselves.