as a victim of child sexual abuse it honestly pisses me off when i hear people describe ai generated images as “child sexual abuse material”
like, no, that’s not what it is. it can very obviously still be Bad, of course, but we still need to be respecting the fact that it is Different. do people not see how it is trivializing actual rape and abuse by comparing it to a made-up image a computer generated out of nothing? how every time i have to talk about Actual Honest To Goodness Real Life Pictures Of A Real Life Child Having Sex Crimes Done To Them In Real Life as a concept i have to add even more fucking qualifiers lest people think i’m talking about drawings or deepfakes
it’s just fucking exhausting
oh and obviously it’s bad to make deepfake porn of someone who’s underage. it’s also frankly just bad to make deepfake porn of anyone no matter what age because that’s called “sexual harassment” and it doesn’t magically become okay when someone hits 18???
@EeveeEuphoria yeah, like do we really need to go and give jpegs magical crime powers? maybe we should be focusing more on actually fucking stopping sexual harassment and rape and all that instead of making up more and more arbitrary fantasy punishments that will never get used because none of it ever gets reported and the cops never arrest anyone for it
@eri @EeveeEuphoria they ended up lowering sentence minimums here because people used it to destroy people’s lives
@eri I'd be tempted to use "synthetic" or "simulated" CSAM for that particular category, simply because a lot of the models are trained on the horrible real stuff, and occasionally they're nigh-indistinguishable to an untrained eye (to the point NCMEC is actually having problems with it pulling resources away from investigating the real stuff)
While it's important to differentiate it from the base "CSAM" term, adding a qualifier that makes it clear that it might look like the real thing despite not directly involving actual children is preferable to coming up with yet another distinct term, IMO.
@eri i mean i see the importance of the distinction but idk what else to call it
@rachaelspooky “deepfake child pornography” or “AI-generated child pornography” would work fine, since the term “child sexual abuse material” describes the photographic and video materials that are produced by the sexual abuse of a child
@charlotte @EeveeEuphoria one quick trick to how any teenager with a camera can cause some real fucking headaches for everyone they know
@eri @EeveeEuphoria teens sending nudes to each other is probably the worst kept secret
oh, and everyone acts like “CSAM” is the only thing that matters because nobody wants to admit that we don’t do shit to actually prevent sexual abuse and we don’t do shit to help the victims of it afterwards either. you can pretend that you’re helping by going crazy about some fucking jpegs, when in reality kids and teens are still getting abused even if all the cameras are turned off. meanwhile you harass victims until they kill themselves for not acting how you want them to, and then the perpetrator gets elected fucking President or whatever.
that’s the world we’re in! everyone ignores the actual foundational problems, and instead they pretend to “help” in a way that either does nothing or makes things worse after the fact because they don’t want to admit that they ignored the problem on purpose at the start.
in a way, it’s not much different to us than how the pedos that jerked off to us a decade ago treated us. all you people see are the photos; you ignore the person whose life was ruined by them.
@eri watching grooming victims get labeled as pedophiles and blocked for things said under the influence of grooming (and hence cutting them off from non-coercive human connection), and then having that label follow them even after they escape from the grooming situation, is one of the most depressing things i’ve ever seen.
it’s so obvious how much people operate purely on feelings of disgust and taintedness and just how much this works against their stated values
@eri there is only bad or good. there is no nuance, there is no gradient of harm. there can be no mitigating circumstances.
there is only ick and disgust.
@eri Apologies. I don't know you, but this appeared on my feed and I couldn't not say something.
AI-generated imagery doesn't appear from the void. AI-generated CSAM exists because AI models have been trained on real CSAM. Generating & distributing AI CSAM is therefore morally equivalent to distributing actual CSAM because it's still just as much a product of actual abuse - the act of running it through an AI model doesn't change that.
@eri Obviously proliferation of CSAM is not equivalent to physical CSA, but we must treat them the same because the former creates a market for the latter.
AI CSAM ~= CSAM ~= CSA
If society doesn't treat the ones on the left as equally morally repugnant as the one on the right, it encourages the one on the right.
@jsbarretto at this point I’m just kind of in awe at how well you’ve proved my point by entirely missing it. good job
@erincandescent i just decided to be entirely evil, it’s a lot easier and i get to have more fun :)
@eri What point? There is no such thing as 'a made-up image generated from nothing', as you claim.
AI models produce fake outputs that look like things because they're trained on real inputs that look like things.
Your argument basically amounts to claiming that "if the meat-grinder is long enough, we can pretend the sausage that comes out isn't made from the bits of pig that went in". That's not how this works.