

There’s an argument to be made that if the system were trained on real CSAM, then using it to generate such imagery would be immoral - but otherwise I don’t think it is, and this feels like a moral panic.
CSAM is by definition evidence of a crime having happened. You can’t create it without hurting a real human being - that’s why it’s illegal. That logic doesn’t apply to simulated images or cartoons. It might be in bad taste, but nobody was hurt in the making of it, and I’m not aware of any solid evidence that viewing such content makes someone more likely to commit the real crime. Same as there’s no proven link between violent movies/games and increased real-world violence.
There’s really no limit to how far this can be taken. In the past the line was clear: was a child hurt? If yes, illegal. Now we’re effectively moving toward banning violent video games and cartoons. Tomorrow it’s stick-figure fight scenes, and soon you’re not even allowed to think about it.
Of course I’m being hyperbolic here - just trying to make a point. I don’t think “I don’t like it” is justification for banning something if it can’t be shown to cause actual harm. If solid science ever proves it increases the likelihood of offending against real humans, then yeah, that’s different. But I don’t think we have that evidence. Even among pedophiles, most never offend. The vast majority of people in prison for child sexual abuse are just plain old rapists with no particular fixation on kids - they’re simply easy targets.
My parents’ porn VHS collection didn’t ask for my age, and neither did my granddad’s titty magazines hidden in the tractor shed. The internet wasn’t even a thing for most people then, yet I had already seen plenty of porn before I turned 10.
It clearly did affect me and I don’t deny that, but I doubt it would’ve made much of a difference if I’d had to wait until I was 18 for the floodgates to open. Nobody would’ve seen me for months if that was the case.