Combating CSAM is great and all, but something tells me this will also be used for far more sinister purposes.
Always has been, always will be, unfortunately. It's a classic "Think of the children" change.
Does this really even matter for combating this? Were pedos really so stupid that they were putting their shit on the cloud?
I'm all for stopping pedophiles, but this seems like a scary breach of privacy. Is an Apple employee going to look at my young-looking but completely legal nudes?
Agreed, my penis could easily be mistaken for a child’s
That's a Reddit-type funny comment, and I hope this kind of humor transfers over with the Reddit refugees.
Lemmy is maturing! More than your penis apparently.
I agree.
Any detection logic has to take the video in, i.e. the series of images, feed it to the detection code, and return results. For a hidden or very big program, the code could just as well be:
if nudity:
    send_to_apple_server()
    print("We detected nudity, and flagged this video")
The user cannot tell it apart from well-intentioned code. The right thing to do is not to track at all! No "SMART" logic to "HELP"!
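To make this concrete, here is a minimal sketch (all names hypothetical, with a toy stand-in classifier) of why a scanner that only flags locally and one that also phones home look identical to the user:

def classify(image: bytes) -> bool:
    # Stand-in for a real detector (perceptual hash, ML model, etc.); toy placeholder only.
    return b"nudity" in image

def send_to_apple_server(image: bytes) -> None:
    # Hypothetical phone-home step; invisible unless you audit the binary or watch network traffic.
    pass

def process(image: bytes) -> None:
    if classify(image):
        send_to_apple_server(image)  # the user never sees this call
        print("We detected nudity, and flagged this video")

process(b"holiday photo")            # nothing happens
process(b"frame with nudity bytes")  # flagged and silently uploaded

Both versions print the same message; only the hidden network call differs, and you can't see that without auditing the code.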
@Tretiak I wonder when they will expand it to combat establishment dissent and free speech in general?
Don't worry, most social media platforms already do that free of charge 😁
This would scan regardless of whether iCloud is enabled, but only for minors. Correct?
Once the capability exists, how hard would it be for a future fascist regime to tell Apple to turn it on for whatever other purposes?
Under His Eye. Blessed be the fruit.
If you believe Apple, then yes.