A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and ubiquity of generative AI being used for nefarious purposes.
Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage as law enforcement led the uniform-wearing McCorkle out of the theater in handcuffs.
Then every artist creating loli porn would also have to be jailed for child pornography.
In some countries drawn child porn is illegal as well.
But this is the US… and it’s kind of a double standard if you’re not arrested for drawing it but you are for generating it.
There is a difference between something immediately identifiable as a drawing and something almost photorealistic. If a generated image is indistinguishable from a real photo, it should be treated the same.
The core reason CSAM is illegal is not that we don’t want people to watch it, but that we don’t want them to create it, which is synonymous with child abuse. Jailing someone for drawing a picture like that is absurd. While it might be in bad taste, there is no victim. No one was harmed. Using generative AI is the same thing: no matter how much simulated CSAM you create with it, not a single child is harmed in doing so. Jailing people for that is the very definition of a moral panic.
Now, if actual CSAM was used in the training of that AI, then it’s a more complex question. However, such content doesn’t need to be in the training data for the model to create simulated CSAM, and as long as that is the case, it is immoral to punish people for creating something that only looks like it but isn’t.
It could be argued that even drawn imagery can inspire and encourage later real-world abuse.
Sure, but the same argument could be made about violent movies / games / books… It’s a rather slippery slope, and as far as I know there doesn’t seem to be a correlation between violent games and real-life violence; in fact, I believe the correlation is negative.
Hmm, it seems a bit unclear what the actual damage and risk are.
https://en.m.wikipedia.org/wiki/Relationship_between_child_pornography_and_child_sexual_abuse
I don’t see a slippery slope here.
I don’t advocate for either, but they should NOT be treated the same. One doesn’t involve a child being traumatized. I’d rather a necrophiliac make AI-generated pics instead of… you know.