this post was submitted on 01 Apr 2025
191 points (99.0% liked)
Technology
I'm no pedo, but what you do in your own home that hurts nobody is your own business.
Yes, but how is the AI making the images or videos? It has to be trained on SOMETHING.
So, regardless of direct harm or not, harm is done at some point in the process, and it needs to be stopped before it slips and gets worse because people "get used to" it.
AI can combine two things. It can train on completely normal pictures of children, and it can train on completely normal adult porn, and then it can put those together.
This is the same reason it can do something like Godzilla with Sailor Moon's hair, not because it trained on images of Godzilla with Sailor Moon's hair, but because it can combine those two separate things.
Except the real things here are actual humans, who have likely not consented to ever being in this database at all, let alone having parts of their likeness used for this horrific shit. There is no moral argument for this garbage.
Technically speaking, if you post images of your child on social media, you have consented. If you never uploaded an image of your child online, you never need to worry.
Social media has been around a long time. It is not reasonable to expect people to anticipate technology they can't even imagine existing ten years in the future when "consenting" to use a platform. Legally you are correct. Morally, this is obviously terrible. Everything about how terms and conditions are communicated is designed to take advantage of people who won't, or are unable to, parse their meaning. Consent needs to be informed.
Even when consent is informed it can still be fucky. Do you think I want to consent to an arbitration agreement with my employer or a social media platform? Fuck no, but I want a job and interaction so I go where the money/people are. I can't hunt around for a place that will hire me and also doesn't have arbitration.
Consent at the barrel of a gun, no matter how well informed, is no consent at all.
This is a great point. Manufactured consent and all.
In many countries mandatory arbitration agreements in a B2C context are invalid. They have no legal power.
Ngl this feels like arguing semantics.
Fair enough. I still think it shouldn't be allowed though.
Why? Not pressing but just curious what the logic is
I wouldn't think it needs to have child porn in the training data to be able to generate it. It has porn in the data, and it knows what kids look like; merge the two. I think that works for anything the AI knows about: make this resemble that.
That's fair, but I still think it shouldn't be accepted or allowed.
It seems pretty understandable that companies wouldn't allow it. It's more that if it is illegal (like in some places), then that gets into really sketchy territory, imo.
I agree it shouldn't be accepted, but I disagree on being allowed. I think it should be allowed because it doesn't hurt anyone.
Ah, right, I almost forgot the killer games rhetoric.
I also don't agree with the killer games thing, but humans are very adaptable as a species.
Normally that's a good thing, but in a case like this, exposure to something shocking or upsetting can make it less shocking or upsetting over time (obviously not in every case). So, if AI is being used for something like this and it's being reported on, isn't it possible that people might slowly get desensitized to it over time?
But what if pedophiles in therapy are less likely to commit a crime if they have access to such porn? Even better, then, if it can be AI-generated, no?