
Study shows AI image-generators being trained on explicit photos of children

Hidden inside the foundation of popular artificial intelligence image-generators are thousands of images of child sexual abuse, according to a new report that urges companies to take action to address a harmful flaw in the technology they built.

[–] giggling_engine@lemmy.world 5 points 1 year ago (2 children)

Honest question: why is this a problem rather than a solution?

If these kids don't exist, and having those fake pictures makes some people content, what's the harm?

Kinda reminds me of furries getting horny over wolf drawings. Who cares?

[–] Supermariofan67@programming.dev 2 points 1 year ago* (last edited 1 year ago)

I agree with you in instances where it's not generating a real person. But there are cases where people use tools like this to generate realistic-looking but fake images of actual, specific real-life children. This is, of course, abusive to that child. And it's still bad when it's done to adults too; it's essentially a form of defamation.

I really do hope legislation around this issue is narrowly tailored to actual abuse like what I described above, but given the "protect the children" rhetoric constantly aimed at just about every technology, including end-to-end encryption, I'm not very optimistic.

Another thing I wonder about is whether AI could get so realistic that it becomes impossible to prove beyond a reasonable doubt that anyone with actual CSAM (where the image/victim isn't known, so it can't be proven that way) is guilty, since any image could plausibly be fake. This issue goes far beyond child abuse; it would probably discredit video footage for robberies and that sort of thing too. We really are venturing into the unknown, and government isn't exactly known for adapting to technology...

But I think you're mostly correct, because the moral outrage on social media seems to be about the entire concept of fake sexual depictions of minors existing at all, rather than only about abusive ones.

[–] Nommer@sh.itjust.works -3 points 1 year ago (1 children)

Did you just compare furries to pedophiles? One of those is harmless, the other is not.

[–] giggling_engine@lemmy.world 7 points 1 year ago (1 children)

Ironically, I was giving them as an example of something OK. My point just went over your head.

[–] Nommer@sh.itjust.works -5 points 1 year ago (2 children)

No, it didn't, but it seems like mine went over yours. Furries are usually fine outside of a few bad actors, but pedophiles are mentally ill and should not be allowed to generate AI CSAM just to satisfy themselves. They should be seeking help, not jerking it to fake kiddies.

[–] JackGreenEarth@lemm.ee -1 points 1 year ago (1 children)

People used to say trans people were mentally ill. If they're not harming actual children, what's it to you what their fetish is?

[–] Nommer@sh.itjust.works 0 points 11 months ago (1 children)

Stop trying to justify your mental illness. You need to seek help if you think fucking kids is okay.

[–] JackGreenEarth@lemm.ee 0 points 11 months ago

I don't think that. Obviously, actually having sex with children is wrong. I was saying that generating AI child porn that involves no actual harm to children can't be logically argued to be immoral.