AI and child sexual abuse, hot take

as a victim of child sexual abuse it honestly pisses me off when i hear people describe ai generated images as “child sexual abuse material”

like, no, that’s not what it is. it can very obviously still be Bad, of course, but we still need to respect the fact that it is Different. do people not see how it trivializes actual rape and abuse to compare it to a made-up image a computer generated out of nothing? how every time i have to talk about Actual Honest To Goodness Real Life Pictures Of A Real Life Child Having Sex Crimes Done To Them In Real Life as a concept i have to add even more fucking qualifiers lest people think i’m talking about drawings or deepfakes

it’s just fucking exhausting

re: AI and child sexual abuse, hot take

oh and obviously it’s bad to make deepfake porn of someone who’s underage. it’s also frankly just bad to make deepfake porn of anyone no matter what age because that’s called “sexual harassment” and it doesn’t magically become okay when someone hits 18???

re: AI and child sexual abuse, hot take
@eri god this has been my take too

once again, the abuse that has happened to me is being compared to pixels on a screen, of a child that has never existed

and naturally the laws countries are passing to ban this AI """"CSAM"""" are also being used to ban artwork in general. wow, who would've thought that the people banning fake stuff want to ban other fake stuff.

and like yeah. deepfakes fucking suck on a moral level, of course they do. but christ stop comparing my real lived experiences to bullshit like this
re: AI and child sexual abuse, hot take

@EeveeEuphoria yeah, like do we really need to go and give jpegs magical crime powers? maybe we should be focusing more on actually fucking stopping sexual harassment and rape and all that instead of making up more and more arbitrary fantasy punishments that will never get used because none of it ever gets reported and the cops never arrest anyone for it

re: AI and child sexual abuse, hot take
@eri ah but you see, that'd end up getting a bunch of people in power arrested, and we don't want that
re: AI and child sexual abuse, hot take

@eri @EeveeEuphoria they ended up lowering sentence minimums here because people used it to destroy people’s lives

"AI CSAM" terminology discussion, possible hot take

@eri I'd be tempted to use "synthetic" or "simulated" CSAM for that particular category, simply because a lot of the models are trained on the horrible real stuff, and occasionally they're nigh-indistinguishable to an untrained eye (to the point NCMEC is actually having problems with it pulling resources away from investigating the real stuff)

While it's important to differentiate it from the base "CSAM" term, adding a qualifier that makes it clear that it might look like the real thing despite not directly involving actual children is preferable to coming up with yet another distinct term, IMO.

re: AI and child sexual abuse, hot take

@eri i mean i see the importance of the distinction but idk what else to call it

re: AI and child sexual abuse, hot take

@rachaelspooky “deepfake child pornography” or “AI-generated child pornography” would work fine. since the term “child sexual abuse material” describes the photographic and video materials that are produced by the sexual abuse of a child

re: AI and child sexual abuse, hot take

@charlotte @EeveeEuphoria one quick trick: any teenager with a camera can cause some real fucking headaches for everyone they know

re: AI and child sexual abuse, hot take

@eri @EeveeEuphoria teens sending nudes to each other is probably the worst kept secret

just PTSD ranting at this point. child sexual abuse, objectification, suicide mention

oh, and everyone acts like “CSAM” is the only thing that matters because nobody wants to admit that we don’t do shit to actually prevent sexual abuse and we don’t do shit to help the victims of it afterwards either. you can pretend that you’re helping by going crazy about some fucking jpegs, when in reality kids and teens are still getting abused even if all the cameras are turned off. meanwhile you harass victims until they kill themselves for not acting how you want them to, and then the perpetrator gets elected fucking President or whatever.

that’s the world we’re in! everyone ignores the actual foundational problems, and instead they pretend to “help” in a way that either does nothing or makes things worse after the fact because they don’t want to admit that they ignored the problem on purpose at the start.

in a way, it’s not much different to us than how the pedos that jerked off to us a decade ago treated us. all you people see are the photos; you ignore the person whose life was ruined by them.

re: AI and child sexual abuse, hot take
@eri @EeveeEuphoria
> magical crime powers

the way people talk about it really does seem to necessitate an underlying belief that the image is literally an icon, a symbol in the mystic sense, that ontologically brings the viewer into communion with what it depicts

http://orthodoxinfo.co...
> The second stage is embodied in the New Testament, which is characterized by the iconic (by image). Here we have the "true form [eikon, or icon] of these realities." The third stage of this relationship will, of course, be the Kingdom of God to come, in which man will see reality itself, "face to face." Clearly, with regard to iconography, the "symbolic" can occupy only a secondary position, since the significant quality of an icon par excellence is the fact that it constitutes a real image of that which it depicts. The image is in some way a "true" form of the prototype, participating in it and integrally bound to it.
re: AI and child sexual abuse, hot take
@eri "AI-generated / artistic depiction is not actual CSAM" is like "the emperor is naked" but instead of merely being unfit for station they think you're a lolisho gooner for saying it

> and it doesn’t magically become okay when someone hits 18???
i'm reminded again of how i keep wanting to add CSAM to the big list of why genAI embodies everything that was ever bad about the internet https://kill-corporati... but never did because every potential example ends up being better described as some other bad thing that's already on the list
re: just PTSD ranting at this point. child sexual abuse, objectification, suicide mention

@eri watching grooming victims get labeled as pedophiles and blocked for things said under the influence of grooming (and hence cutting them off from non-coercive human connection), and then having that label follow them even after they escape from the grooming situation, is one of the most depressing things i’ve ever seen.

it’s so obvious how much people operate purely on feelings of disgust and taintedness, and just how much this works against their stated values

re: just PTSD ranting at this point. child sexual abuse, objectification, suicide mention

@eri there is only bad or good. there is no nuance, there is no gradient of harm. there can be no mitigating circumstances.

there is only ick and disgust.

re: AI and child sexual abuse, hot take

@eri Apologies. I don't know you, but this appeared on my feed and I couldn't not say something.

AI-generated imagery doesn't appear from the void. AI-generated CSAM exists because AI models have been trained on real CSAM. Generating & distributing AI CSAM is therefore morally equivalent to distributing actual CSAM because it's still just as much a product of actual abuse - the act of running it through an AI model doesn't change that.

re: AI and child sexual abuse, hot take

@eri Obviously proliferation of CSAM is not equivalent to physical CSA, but we must treat them the same because the former creates a market for the latter.

AI CSAM ~= CSAM ~= CSA

If society doesn't treat the ones on the left as equally morally repugnant as the one on the right, it encourages the one on the right.

re: AI and child sexual abuse, hot take

@jsbarretto at this point I’m just kind of in awe at how well you’ve proved my point by entirely missing it. good job


@erincandescent i just decided to be entirely evil it’s a lot easier and i get to have more fun :)

re: AI and child sexual abuse, hot take

@eri What point? There is no such thing as 'a made-up image generated from nothing', as you claim.

AI models produce fake outputs that look like things because they're trained on real inputs that look like things.

Your argument basically amounts to claiming that "if the meat-grinder is long enough, we can pretend the sausage that comes out isn't made from the bits of pig that went in". That's not how this works.
