this post was submitted on 25 Oct 2023
125 points (100.0% liked)

[–] coffeejunky@beehaw.org 12 points 1 year ago (3 children)

Yeah, exactly. I don't want to see it, but the same goes for a lot of weird fetishes.

As long as no one is getting hurt I don't really see the problem.

[–] BarryZuckerkorn@beehaw.org 17 points 1 year ago

As long as no one is getting hurt I don’t really see the problem.

It'd be hard to actually meet that premise, though. People are getting hurt.

Child abuse imagery is used as a currency within those circles to incentivize additional distribution, which means there is demand for ongoing, new abuse of actual victims. Extending that economic analogy, seeding that economy with liquidity might or might not incentivize the creation of new authentic child abuse imagery (which requires a child victim to create). That part is less clear, but what is clear is that it would reduce the transaction costs of distributing existing child abuse imagery, which is a form of re-victimizing those who have already been abused.

Child abuse imagery is also used as a grooming technique: normalizing child sexual activity is how a lot of abusers persuade children to engage in sexual acts. Providing victimless "seed" material might still result in actual abuse happening down the line.

If the creation of AI-generated child abuse imagery begins to give actual abusers and users of real child abuse imagery cover, to where it becomes more difficult to investigate the crime or secure convictions against child rapists, then the proliferation of this technology would make it easier to victimize additional children without consequences.

I'm not sure what the latest research says about the extent to which viewing and consuming child porn leads to harmful behavior down the line (on the one hand, maybe it's a comparatively harmless outlet for unhealthy urges; on the other hand, it may feed an addictive cycle that results in net additional harm to society).

I'm sure there are a lot of other considerations and social forces at play, too.